Dec 01 09:07:57 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 09:07:57 crc restorecon[4580]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:07:57 crc restorecon[4580]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc 
restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc 
restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 
09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc 
restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc 
restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 
crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:57 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 
09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc 
restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc 
restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:07:58 crc restorecon[4580]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc 
restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 
crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc 
restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc 
restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc 
restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc 
restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:07:58 crc restorecon[4580]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 01 09:07:58 crc kubenswrapper[4867]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 09:07:58 crc kubenswrapper[4867]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 01 09:07:58 crc kubenswrapper[4867]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 09:07:58 crc kubenswrapper[4867]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 01 09:07:58 crc kubenswrapper[4867]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 01 09:07:58 crc kubenswrapper[4867]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.672873 4867 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676642 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676672 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676680 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676688 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676696 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676704 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676712 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676718 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676725 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676731 4867 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676737 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676743 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676749 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676757 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676765 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676772 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676778 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676784 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676790 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676806 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676834 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676841 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676847 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676853 4867 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676860 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676867 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676872 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676879 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676885 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676891 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676899 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676908 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676917 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676926 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676936 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676943 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676952 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676959 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676967 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676975 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676981 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676987 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.676995 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677001 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677009 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677014 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677022 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677027 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677033 4867 feature_gate.go:330] unrecognized 
feature gate: NetworkLiveMigration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677039 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677046 4867 feature_gate.go:330] unrecognized feature gate: Example Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677054 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677060 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677067 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677072 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677080 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677086 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677093 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677099 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677105 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677110 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677116 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677122 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 
09:07:58.677127 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677133 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677139 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677145 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677152 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677157 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677163 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.677168 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677506 4867 flags.go:64] FLAG: --address="0.0.0.0" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677528 4867 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677541 4867 flags.go:64] FLAG: --anonymous-auth="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677550 4867 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677560 4867 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677567 4867 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677578 4867 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677588 4867 flags.go:64] FLAG: 
--authorization-webhook-cache-authorized-ttl="5m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677595 4867 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677602 4867 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677609 4867 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677616 4867 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677624 4867 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677631 4867 flags.go:64] FLAG: --cgroup-root="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677637 4867 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677645 4867 flags.go:64] FLAG: --client-ca-file="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677653 4867 flags.go:64] FLAG: --cloud-config="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677660 4867 flags.go:64] FLAG: --cloud-provider="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677667 4867 flags.go:64] FLAG: --cluster-dns="[]" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677678 4867 flags.go:64] FLAG: --cluster-domain="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677685 4867 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677692 4867 flags.go:64] FLAG: --config-dir="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677699 4867 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677707 4867 flags.go:64] FLAG: --container-log-max-files="5" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 
09:07:58.677717 4867 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677724 4867 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677731 4867 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677738 4867 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677746 4867 flags.go:64] FLAG: --contention-profiling="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677753 4867 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677759 4867 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677767 4867 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677776 4867 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677785 4867 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677792 4867 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677799 4867 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677833 4867 flags.go:64] FLAG: --enable-load-reader="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677842 4867 flags.go:64] FLAG: --enable-server="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677850 4867 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677868 4867 flags.go:64] FLAG: --event-burst="100" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677875 4867 flags.go:64] FLAG: --event-qps="50" Dec 01 09:07:58 
crc kubenswrapper[4867]: I1201 09:07:58.677882 4867 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677889 4867 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677896 4867 flags.go:64] FLAG: --eviction-hard="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677911 4867 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677918 4867 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677925 4867 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677933 4867 flags.go:64] FLAG: --eviction-soft="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677941 4867 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677948 4867 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677955 4867 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677963 4867 flags.go:64] FLAG: --experimental-mounter-path="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677970 4867 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677993 4867 flags.go:64] FLAG: --fail-swap-on="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.677999 4867 flags.go:64] FLAG: --feature-gates="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678008 4867 flags.go:64] FLAG: --file-check-frequency="20s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678015 4867 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678021 4867 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678028 4867 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678035 4867 flags.go:64] FLAG: --healthz-port="10248" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678042 4867 flags.go:64] FLAG: --help="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678049 4867 flags.go:64] FLAG: --hostname-override="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678055 4867 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678062 4867 flags.go:64] FLAG: --http-check-frequency="20s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678070 4867 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678076 4867 flags.go:64] FLAG: --image-credential-provider-config="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678083 4867 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678090 4867 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678105 4867 flags.go:64] FLAG: --image-service-endpoint="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678114 4867 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678120 4867 flags.go:64] FLAG: --kube-api-burst="100" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678127 4867 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678135 4867 flags.go:64] FLAG: --kube-api-qps="50" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678142 4867 flags.go:64] FLAG: --kube-reserved="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678149 4867 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678155 4867 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678163 4867 flags.go:64] FLAG: --kubelet-cgroups="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678171 4867 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678178 4867 flags.go:64] FLAG: --lock-file="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678185 4867 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678193 4867 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678200 4867 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678212 4867 flags.go:64] FLAG: --log-json-split-stream="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678220 4867 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678227 4867 flags.go:64] FLAG: --log-text-split-stream="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678234 4867 flags.go:64] FLAG: --logging-format="text" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678241 4867 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678249 4867 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678257 4867 flags.go:64] FLAG: --manifest-url="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678264 4867 flags.go:64] FLAG: --manifest-url-header="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678274 4867 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678281 4867 
flags.go:64] FLAG: --max-open-files="1000000" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678290 4867 flags.go:64] FLAG: --max-pods="110" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678297 4867 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678304 4867 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678311 4867 flags.go:64] FLAG: --memory-manager-policy="None" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678318 4867 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678325 4867 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678332 4867 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678339 4867 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678358 4867 flags.go:64] FLAG: --node-status-max-images="50" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678365 4867 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678372 4867 flags.go:64] FLAG: --oom-score-adj="-999" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678380 4867 flags.go:64] FLAG: --pod-cidr="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678388 4867 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678400 4867 flags.go:64] FLAG: --pod-manifest-path="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678409 4867 flags.go:64] FLAG: --pod-max-pids="-1" Dec 01 
09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678416 4867 flags.go:64] FLAG: --pods-per-core="0" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678423 4867 flags.go:64] FLAG: --port="10250" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678432 4867 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678440 4867 flags.go:64] FLAG: --provider-id="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678447 4867 flags.go:64] FLAG: --qos-reserved="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678454 4867 flags.go:64] FLAG: --read-only-port="10255" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678461 4867 flags.go:64] FLAG: --register-node="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678468 4867 flags.go:64] FLAG: --register-schedulable="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678475 4867 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678490 4867 flags.go:64] FLAG: --registry-burst="10" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678497 4867 flags.go:64] FLAG: --registry-qps="5" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678504 4867 flags.go:64] FLAG: --reserved-cpus="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678511 4867 flags.go:64] FLAG: --reserved-memory="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678519 4867 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678527 4867 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678534 4867 flags.go:64] FLAG: --rotate-certificates="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678541 4867 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678548 
4867 flags.go:64] FLAG: --runonce="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678555 4867 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678563 4867 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678570 4867 flags.go:64] FLAG: --seccomp-default="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678577 4867 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678584 4867 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678591 4867 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678599 4867 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678606 4867 flags.go:64] FLAG: --storage-driver-password="root" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678613 4867 flags.go:64] FLAG: --storage-driver-secure="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678620 4867 flags.go:64] FLAG: --storage-driver-table="stats" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678627 4867 flags.go:64] FLAG: --storage-driver-user="root" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678634 4867 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678642 4867 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678650 4867 flags.go:64] FLAG: --system-cgroups="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678656 4867 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678670 4867 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 01 
09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678678 4867 flags.go:64] FLAG: --tls-cert-file="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678685 4867 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678695 4867 flags.go:64] FLAG: --tls-min-version="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678702 4867 flags.go:64] FLAG: --tls-private-key-file="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678709 4867 flags.go:64] FLAG: --topology-manager-policy="none" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678717 4867 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678724 4867 flags.go:64] FLAG: --topology-manager-scope="container" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678732 4867 flags.go:64] FLAG: --v="2" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678742 4867 flags.go:64] FLAG: --version="false" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678752 4867 flags.go:64] FLAG: --vmodule="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678761 4867 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.678769 4867 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679002 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679016 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679023 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679030 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679036 4867 feature_gate.go:330] 
unrecognized feature gate: BuildCSIVolumes Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679044 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679050 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679056 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679062 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679069 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679075 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679082 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679088 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679094 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679101 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679108 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679113 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679120 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679125 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 09:07:58 crc 
kubenswrapper[4867]: W1201 09:07:58.679131 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679140 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679148 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679155 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679174 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679182 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679189 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679196 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679202 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679208 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679214 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679221 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679227 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679232 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 
09:07:58.679240 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679248 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679254 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679261 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679269 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679277 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679284 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679290 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679297 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679305 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679311 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679317 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679323 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679329 4867 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679336 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679342 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679348 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679354 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679360 4867 feature_gate.go:330] unrecognized feature gate: Example Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679367 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679372 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679378 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679384 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679390 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679404 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679410 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679427 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679433 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 
09:07:58.679440 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679446 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679452 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679458 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679463 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679469 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679475 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679481 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679488 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.679496 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
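The blocks of feature_gate.go:330 warnings above are the kubelet rejecting gate names (OpenShift-specific ones such as OnClusterBuild or ManagedBootImagesAWS) that upstream Kubernetes does not recognize; the same set is logged more than once during startup. A minimal sketch of tallying them from captured journal text — the sample input below is a hypothetical fragment in the same shape as this log, not an excerpt of it:

```python
import re
from collections import Counter

# Matches the kubelet's feature_gate.go:330 warning lines.
GATE_RE = re.compile(r"unrecognized feature gate: (\w+)")

def count_unrecognized_gates(log_text: str) -> Counter:
    """Tally how often each unrecognized feature gate is warned about."""
    return Counter(GATE_RE.findall(log_text))

# Hypothetical sample lines shaped like the journal entries above.
sample = (
    "W1201 09:07:58.676737 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild\n"
    "W1201 09:07:58.679155 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild\n"
    "W1201 09:07:58.676743 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS\n"
)
print(count_unrecognized_gates(sample).most_common())
# → [('OnClusterBuild', 2), ('ManagedBootImagesAWS', 1)]
```

In practice one would feed this the output of `journalctl -u kubelet` (assumption: the unit name here); a count greater than one per gate, as in this log, just reflects the kubelet parsing its feature-gate configuration in more than one pass.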
Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.679516 4867 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.691720 4867 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.691770 4867 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.691986 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692002 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692011 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692020 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692028 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692036 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692044 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692052 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692060 4867 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstall Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692072 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692084 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692094 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692102 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692111 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692119 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692126 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692134 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692142 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692149 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692157 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692165 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692173 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692181 4867 feature_gate.go:330] 
unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692190 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692197 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692205 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692212 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692220 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692228 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692235 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692243 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692251 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692259 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692266 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692285 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692294 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692301 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 
09:07:58.692309 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692319 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692332 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692340 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692351 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692361 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692370 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692379 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692387 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692394 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692403 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692410 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692418 4867 feature_gate.go:330] unrecognized feature gate: Example Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692426 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 
09:07:58.692433 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692441 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692449 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692456 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692464 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692472 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692479 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692487 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692494 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692502 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692510 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692517 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692525 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692532 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692543 4867 
feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692552 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692560 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692568 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692576 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692594 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.692608 4867 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692938 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692951 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692960 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.692998 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693008 4867 feature_gate.go:330] 
unrecognized feature gate: NutanixMultiSubnets Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693016 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693024 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693034 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693042 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693050 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693058 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693066 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693073 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693081 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693089 4867 feature_gate.go:330] unrecognized feature gate: Example Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693096 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693104 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693111 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693119 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 09:07:58 crc 
kubenswrapper[4867]: W1201 09:07:58.693127 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693135 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693143 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693150 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693158 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693166 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693173 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693181 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693188 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693196 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693204 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693212 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693219 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693228 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693235 4867 feature_gate.go:330] unrecognized feature gate: 
MinimumKubeletVersion Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693254 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693262 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693272 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693282 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693290 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693298 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693306 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693314 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693322 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693329 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693340 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693349 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693357 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693365 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693373 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693381 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693389 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693396 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693404 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693412 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693419 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693427 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693434 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693442 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693452 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693462 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693470 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693478 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693487 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693495 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693503 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693511 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693521 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693529 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693537 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693545 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.693571 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.693584 4867 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.693878 4867 server.go:940] "Client rotation is on, will bootstrap in background" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.698893 4867 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.699068 4867 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.700171 4867 server.go:997] "Starting client certificate rotation" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.700219 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.700751 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-02 05:26:32.392414264 +0000 UTC Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.700917 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h18m33.691505059s for next certificate rotation Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.710442 4867 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.712703 4867 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.720123 4867 log.go:25] "Validated CRI v1 runtime API" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.741887 4867 log.go:25] "Validated CRI v1 image API" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.745912 4867 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.748805 4867 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-09-00-59-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.748922 4867 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.758284 4867 manager.go:217] Machine: {Timestamp:2025-12-01 09:07:58.757365019 +0000 UTC m=+0.216751793 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6a9666c0-d065-46a2-bf0b-9da61e045701 BootID:8a65d7c2-3f9a-40e7-a739-7e76b1a2f333 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d3:3c:63 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d3:3c:63 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f7:c8:81 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d5:f6:fb Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:74:0e:b3 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:61:7f:11 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e6:ab:de:8f:f3:a9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5a:83:b4:be:27:89 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 
Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.758703 4867 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.758908 4867 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.759337 4867 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.759620 4867 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.759737 4867 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.760079 4867 topology_manager.go:138] "Creating topology manager with none policy" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.760191 4867 container_manager_linux.go:303] "Creating device plugin manager" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.760492 4867 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.760646 4867 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.760936 4867 state_mem.go:36] "Initialized new in-memory state store" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.761071 4867 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.761724 4867 kubelet.go:418] "Attempting to sync node with API server" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.761852 4867 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.761998 4867 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.762114 4867 kubelet.go:324] "Adding apiserver pod source" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.762181 4867 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.766841 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.767021 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.767441 4867 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.767504 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.767980 4867 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.768796 4867 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.769574 4867 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770254 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770280 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770291 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770301 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770314 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770322 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770332 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770344 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770354 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770363 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770389 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770398 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.770619 4867 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.771087 4867 server.go:1280] "Started kubelet" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.771578 4867 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.771945 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.771586 4867 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 01 09:07:58 crc systemd[1]: Started Kubernetes Kubelet. Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.773762 4867 server.go:460] "Adding debug handlers to kubelet server" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.774028 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.774078 4867 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.774768 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.774937 4867 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.774978 4867 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.775015 4867 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.775073 4867 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 01 09:07:58 crc 
kubenswrapper[4867]: I1201 09:07:58.774941 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 08:38:19.42049202 +0000 UTC Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.775909 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="200ms" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.780519 4867 factory.go:153] Registering CRI-O factory Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.780550 4867 factory.go:221] Registration of the crio container factory successfully Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.780619 4867 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.780628 4867 factory.go:55] Registering systemd factory Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.780634 4867 factory.go:221] Registration of the systemd container factory successfully Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.780649 4867 factory.go:103] Registering Raw factory Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.780624 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.780660 4867 manager.go:1196] Started watching for new ooms in manager Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.780694 4867 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.783502 4867 manager.go:319] Starting recovery of all containers Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.776744 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d0c3b84e12fd5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 09:07:58.771056597 +0000 UTC m=+0.230443361,LastTimestamp:2025-12-01 09:07:58.771056597 +0000 UTC m=+0.230443361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791003 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791070 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: 
I1201 09:07:58.791089 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791103 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791122 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791138 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791156 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791172 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791192 4867 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791206 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791220 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791238 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791251 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791272 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791285 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791302 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791314 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791722 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791769 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791782 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791793 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791805 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791828 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791837 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791851 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791860 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791875 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791886 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.791897 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794162 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794202 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794217 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794230 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794242 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794254 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794267 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794280 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794289 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.794299 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795179 4867 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795216 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795229 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795240 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795250 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795259 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795273 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795282 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795292 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795302 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795312 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795323 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" 
seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795361 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795374 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795413 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795428 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795440 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795450 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795460 4867 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795469 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795478 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795487 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795514 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795522 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795533 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795543 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795565 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795575 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795585 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795594 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795614 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795623 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795632 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795643 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795652 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795661 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795682 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795694 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795718 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795754 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795763 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795772 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795783 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795828 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795840 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795850 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795870 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795880 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795889 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795899 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795910 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795919 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795929 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795938 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795960 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" 
seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795969 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795979 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795988 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.795996 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796004 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796013 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796022 4867 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796041 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796050 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796058 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796068 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796118 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796128 4867 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796147 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796157 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796177 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796186 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796197 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796206 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796216 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796225 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796235 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796244 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796264 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796273 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" 
seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796283 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796292 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796302 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796311 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796321 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796331 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 01 09:07:58 crc 
kubenswrapper[4867]: I1201 09:07:58.796351 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796361 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796371 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796379 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796388 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796397 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796407 4867 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796415 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796427 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796437 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796447 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796456 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796465 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796475 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796484 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796493 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796515 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796524 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796535 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796544 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796552 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796560 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796569 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796578 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796595 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796604 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796612 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796620 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796630 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796640 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796649 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796658 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796680 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796690 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796699 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796708 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796738 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796748 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796757 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796766 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796796 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796805 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796851 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796861 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796874 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796883 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796892 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796900 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796912 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796921 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796930 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796938 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796946 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796956 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796964 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" 
seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796972 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796984 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.796991 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797000 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797009 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797021 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797030 
4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797039 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797048 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797059 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797068 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797076 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797085 4867 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797093 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797101 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797109 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797119 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797130 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797138 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797151 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797160 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797173 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797186 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797194 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797203 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797216 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797224 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797233 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797241 4867 reconstruct.go:97] "Volume reconstruction finished" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.797248 4867 reconciler.go:26] "Reconciler: start to sync state" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.809504 4867 manager.go:324] Recovery completed Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.821351 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.823434 4867 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.823886 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.823936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.823946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.824842 4867 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.824912 4867 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.824979 4867 state_mem.go:36] "Initialized new in-memory state store" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.825663 4867 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.825709 4867 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.825739 4867 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.825790 4867 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 09:07:58 crc kubenswrapper[4867]: W1201 09:07:58.826934 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.827054 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.875697 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.926744 4867 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.936481 4867 policy_none.go:49] "None policy: Start" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.938156 4867 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.938344 4867 state_mem.go:35] "Initializing new in-memory state store" Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 
09:07:58.976140 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.978645 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="400ms" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.986086 4867 manager.go:334] "Starting Device Plugin manager" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.986195 4867 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.986217 4867 server.go:79] "Starting device plugin registration server" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.986710 4867 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.986733 4867 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.986998 4867 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.987226 4867 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 09:07:58 crc kubenswrapper[4867]: I1201 09:07:58.987315 4867 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 09:07:58 crc kubenswrapper[4867]: E1201 09:07:58.996304 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.087777 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 
09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.089491 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.089524 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.089533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.089560 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:07:59 crc kubenswrapper[4867]: E1201 09:07:59.090467 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Dec 01 09:07:59 crc kubenswrapper[4867]: E1201 09:07:59.094569 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d0c3b84e12fd5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 09:07:58.771056597 +0000 UTC m=+0.230443361,LastTimestamp:2025-12-01 09:07:58.771056597 +0000 UTC m=+0.230443361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.127159 4867 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.127322 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.129289 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.129333 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.129348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.129499 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.129639 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.129690 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.130673 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.130699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.130709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.131055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.131140 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.131200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.131650 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.131708 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.132080 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.133338 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.133381 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.133395 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.137069 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.137123 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.137143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.137378 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.137660 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.137703 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.138391 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.138423 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.138434 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.139367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.139802 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.139829 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.139932 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.140113 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.140214 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.140590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.140622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.140640 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.140873 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.140903 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.141717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.141739 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.141749 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.141937 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.141948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.141959 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204004 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204080 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204101 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204121 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204141 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204159 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204173 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204206 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204227 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204244 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204285 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204300 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204319 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204339 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.204358 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.291042 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.292618 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.292651 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.292659 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.292690 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:07:59 crc kubenswrapper[4867]: E1201 09:07:59.293138 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306314 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306342 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306362 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306379 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306394 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306414 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306428 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306472 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306486 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306501 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306516 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306532 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306546 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 
09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306560 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306573 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306769 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306842 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306871 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306913 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306933 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306949 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306965 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.306987 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.307016 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.307026 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.307042 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.307065 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.307067 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.307113 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.307221 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: E1201 09:07:59.379685 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="800ms" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.477431 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.503549 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: W1201 09:07:59.509719 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-eb67f08cd903a226b454ae31c0ad24f3edae0e8b4b64aa254113b3ab6a8c74e6 WatchSource:0}: Error finding container eb67f08cd903a226b454ae31c0ad24f3edae0e8b4b64aa254113b3ab6a8c74e6: Status 404 returned error can't find the container with id eb67f08cd903a226b454ae31c0ad24f3edae0e8b4b64aa254113b3ab6a8c74e6 Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.518795 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: W1201 09:07:59.538239 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c72271fb234b12ab3434ccfbd4191326fd0b92163fe44d83ab91733f95700083 WatchSource:0}: Error finding container c72271fb234b12ab3434ccfbd4191326fd0b92163fe44d83ab91733f95700083: Status 404 returned error can't find the container with id c72271fb234b12ab3434ccfbd4191326fd0b92163fe44d83ab91733f95700083 Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.546055 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: W1201 09:07:59.547544 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-30294b652628322e92483239e4372e940aa80b0343f23cc3cd7226ab0591faa9 WatchSource:0}: Error finding container 30294b652628322e92483239e4372e940aa80b0343f23cc3cd7226ab0591faa9: Status 404 returned error can't find the container with id 30294b652628322e92483239e4372e940aa80b0343f23cc3cd7226ab0591faa9 Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.557481 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:07:59 crc kubenswrapper[4867]: W1201 09:07:59.580777 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-6fa89204aa842b29805f0ae05da57f53a3b4682528803a15d3fe7e35ae761f31 WatchSource:0}: Error finding container 6fa89204aa842b29805f0ae05da57f53a3b4682528803a15d3fe7e35ae761f31: Status 404 returned error can't find the container with id 6fa89204aa842b29805f0ae05da57f53a3b4682528803a15d3fe7e35ae761f31 Dec 01 09:07:59 crc kubenswrapper[4867]: W1201 09:07:59.641932 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:07:59 crc kubenswrapper[4867]: E1201 09:07:59.642026 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:07:59 crc kubenswrapper[4867]: W1201 09:07:59.687284 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:07:59 crc kubenswrapper[4867]: E1201 09:07:59.687395 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 
38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.693739 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.695407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.695458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.695487 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.695520 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:07:59 crc kubenswrapper[4867]: E1201 09:07:59.696020 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.772666 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.776850 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 20:01:11.651738354 +0000 UTC Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.776899 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 634h53m11.874841685s for next certificate rotation Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.830363 4867 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6fa89204aa842b29805f0ae05da57f53a3b4682528803a15d3fe7e35ae761f31"} Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.831583 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ca4d40fb4041ad866ac718ca9127970d3f2b3f4625dc95f3dfb7c11e30f18776"} Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.832667 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"30294b652628322e92483239e4372e940aa80b0343f23cc3cd7226ab0591faa9"} Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.834590 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c72271fb234b12ab3434ccfbd4191326fd0b92163fe44d83ab91733f95700083"} Dec 01 09:07:59 crc kubenswrapper[4867]: I1201 09:07:59.835978 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"eb67f08cd903a226b454ae31c0ad24f3edae0e8b4b64aa254113b3ab6a8c74e6"} Dec 01 09:08:00 crc kubenswrapper[4867]: W1201 09:08:00.061309 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:08:00 crc kubenswrapper[4867]: E1201 09:08:00.061376 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:08:00 crc kubenswrapper[4867]: W1201 09:08:00.078399 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:08:00 crc kubenswrapper[4867]: E1201 09:08:00.078473 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:08:00 crc kubenswrapper[4867]: E1201 09:08:00.181406 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="1.6s" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.496379 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.497628 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.497670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.497681 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.497708 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:08:00 crc kubenswrapper[4867]: E1201 09:08:00.498177 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.772795 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.840553 4867 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5" exitCode=0 Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.840649 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.840696 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5"} Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.841547 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.841577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.841588 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 
09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.845207 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0"} Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.845246 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772"} Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.845259 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6"} Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.845267 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.845270 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08"} Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.846337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.846377 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.846393 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.847024 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba" exitCode=0 Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.847076 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba"} Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.847107 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.848438 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.848479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.848496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.848617 4867 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f" exitCode=0 Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.848699 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f"} Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.848764 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:00 
crc kubenswrapper[4867]: I1201 09:08:00.849884 4867 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="58e153c5974daa706cb48413036395d69b97895ea274c6750a6f047eae26199f" exitCode=0 Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.849930 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"58e153c5974daa706cb48413036395d69b97895ea274c6750a6f047eae26199f"} Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.849959 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.850053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.850093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.850109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.850856 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.851310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.851337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.851348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.851654 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.851677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:00 crc kubenswrapper[4867]: I1201 09:08:00.851686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.857136 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b"} Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.857174 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a"} Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.857183 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa"} Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.857194 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d"} Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.858670 4867 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652" exitCode=0 Dec 01 09:08:01 crc 
kubenswrapper[4867]: I1201 09:08:01.858706 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652"} Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.858799 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.859476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.859494 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.859501 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.861363 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a61e584042c408ba8ab82818404e520d549af409a05af80fc95cd18dfedfe0d4"} Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.861419 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.862067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.862085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.862093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:01 crc 
kubenswrapper[4867]: I1201 09:08:01.865051 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.865158 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.865427 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96"} Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.865453 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0116ddbcf9fa956327663d364bab7e5374fc92b182321bdc348b4af4ac47d4c3"} Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.865463 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d"} Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.865987 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.866007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.866014 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.866335 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:01 crc 
kubenswrapper[4867]: I1201 09:08:01.866358 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:01 crc kubenswrapper[4867]: I1201 09:08:01.866366 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.098997 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.100516 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.100554 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.100572 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.100602 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.504940 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.869218 4867 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6" exitCode=0 Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.869277 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6"} Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.869331 4867 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.870929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.870993 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.871010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.874035 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169"} Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.874060 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.874160 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.874206 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.874162 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.874295 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.875527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.875551 4867 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.875566 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.875581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.875582 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.875768 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.876183 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.876211 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.876223 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.876210 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.876294 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:02 crc kubenswrapper[4867]: I1201 09:08:02.876315 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:03 crc kubenswrapper[4867]: I1201 09:08:03.880181 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:03 crc kubenswrapper[4867]: I1201 09:08:03.880649 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3"} Dec 01 09:08:03 crc kubenswrapper[4867]: I1201 09:08:03.880689 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:08:03 crc kubenswrapper[4867]: I1201 09:08:03.880705 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b"} Dec 01 09:08:03 crc kubenswrapper[4867]: I1201 09:08:03.880715 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3"} Dec 01 09:08:03 crc kubenswrapper[4867]: I1201 09:08:03.880729 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec"} Dec 01 09:08:03 crc kubenswrapper[4867]: I1201 09:08:03.881062 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:03 crc kubenswrapper[4867]: I1201 09:08:03.881085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:03 crc kubenswrapper[4867]: I1201 09:08:03.881097 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.276013 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.422720 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.888964 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b"} Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.889589 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.889376 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.892995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.893051 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.893051 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.893093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.893111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:04 crc kubenswrapper[4867]: I1201 09:08:04.893066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:05 crc kubenswrapper[4867]: I1201 09:08:05.892370 4867 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 01 09:08:05 crc kubenswrapper[4867]: I1201 09:08:05.892373 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:05 crc kubenswrapper[4867]: I1201 09:08:05.894101 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:05 crc kubenswrapper[4867]: I1201 09:08:05.894126 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:05 crc kubenswrapper[4867]: I1201 09:08:05.894167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:05 crc kubenswrapper[4867]: I1201 09:08:05.894180 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:05 crc kubenswrapper[4867]: I1201 09:08:05.894193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:05 crc kubenswrapper[4867]: I1201 09:08:05.894192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:06 crc kubenswrapper[4867]: I1201 09:08:06.970475 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:08:06 crc kubenswrapper[4867]: I1201 09:08:06.971143 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:06 crc kubenswrapper[4867]: I1201 09:08:06.972426 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:06 crc kubenswrapper[4867]: I1201 09:08:06.972460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:06 crc 
kubenswrapper[4867]: I1201 09:08:06.972472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:06 crc kubenswrapper[4867]: I1201 09:08:06.976968 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.574217 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.574406 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.575852 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.575947 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.576046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.751274 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.896577 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.896622 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.898025 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.898157 4867 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.898257 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.898090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.898433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:07 crc kubenswrapper[4867]: I1201 09:08:07.898463 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.123198 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.374030 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.374227 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.375343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.375387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.375401 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.860958 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.898617 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.899485 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.899523 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:08 crc kubenswrapper[4867]: I1201 09:08:08.899532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:08 crc kubenswrapper[4867]: E1201 09:08:08.997421 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 09:08:09 crc kubenswrapper[4867]: I1201 09:08:09.901250 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:09 crc kubenswrapper[4867]: I1201 09:08:09.902334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:09 crc kubenswrapper[4867]: I1201 09:08:09.902390 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:09 crc kubenswrapper[4867]: I1201 09:08:09.902407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:09 crc kubenswrapper[4867]: I1201 09:08:09.908281 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:08:10 crc kubenswrapper[4867]: I1201 09:08:10.903795 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 
01 09:08:10 crc kubenswrapper[4867]: I1201 09:08:10.904583 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:10 crc kubenswrapper[4867]: I1201 09:08:10.904623 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:10 crc kubenswrapper[4867]: I1201 09:08:10.904637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:11 crc kubenswrapper[4867]: I1201 09:08:11.124121 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 09:08:11 crc kubenswrapper[4867]: I1201 09:08:11.124206 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:08:11 crc kubenswrapper[4867]: I1201 09:08:11.773227 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 01 09:08:11 crc kubenswrapper[4867]: E1201 09:08:11.782706 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 01 09:08:11 crc 
kubenswrapper[4867]: I1201 09:08:11.810800 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 09:08:11 crc kubenswrapper[4867]: I1201 09:08:11.810908 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 09:08:11 crc kubenswrapper[4867]: I1201 09:08:11.819736 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 09:08:11 crc kubenswrapper[4867]: I1201 09:08:11.819830 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.285191 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.285333 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.285770 4867 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.285876 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.286372 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.286406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.286418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.290582 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.913383 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.913795 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.913880 4867 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.914239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.914288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:14 crc kubenswrapper[4867]: I1201 09:08:14.914305 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:15 crc kubenswrapper[4867]: I1201 09:08:15.200754 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 09:08:15 crc kubenswrapper[4867]: I1201 09:08:15.200878 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 09:08:16 crc kubenswrapper[4867]: I1201 09:08:16.808202 4867 trace.go:236] Trace[89653925]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:08:01.967) (total time: 14841ms): Dec 01 09:08:16 crc kubenswrapper[4867]: Trace[89653925]: ---"Objects listed" error: 14840ms (09:08:16.808) Dec 01 09:08:16 crc kubenswrapper[4867]: Trace[89653925]: [14.841015009s] [14.841015009s] END Dec 01 
09:08:16 crc kubenswrapper[4867]: I1201 09:08:16.808237 4867 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 09:08:16 crc kubenswrapper[4867]: I1201 09:08:16.808511 4867 trace.go:236] Trace[668060147]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:08:02.147) (total time: 14660ms): Dec 01 09:08:16 crc kubenswrapper[4867]: Trace[668060147]: ---"Objects listed" error: 14660ms (09:08:16.808) Dec 01 09:08:16 crc kubenswrapper[4867]: Trace[668060147]: [14.660562152s] [14.660562152s] END Dec 01 09:08:16 crc kubenswrapper[4867]: I1201 09:08:16.808524 4867 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 09:08:16 crc kubenswrapper[4867]: I1201 09:08:16.810555 4867 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 01 09:08:16 crc kubenswrapper[4867]: E1201 09:08:16.810595 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 09:08:16 crc kubenswrapper[4867]: I1201 09:08:16.810837 4867 trace.go:236] Trace[1696844411]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:08:02.575) (total time: 14235ms): Dec 01 09:08:16 crc kubenswrapper[4867]: Trace[1696844411]: ---"Objects listed" error: 14235ms (09:08:16.810) Dec 01 09:08:16 crc kubenswrapper[4867]: Trace[1696844411]: [14.235344902s] [14.235344902s] END Dec 01 09:08:16 crc kubenswrapper[4867]: I1201 09:08:16.810875 4867 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 09:08:16 crc kubenswrapper[4867]: I1201 09:08:16.811332 4867 trace.go:236] Trace[2138935176]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:08:01.809) (total time: 15001ms): Dec 01 09:08:16 crc 
kubenswrapper[4867]: Trace[2138935176]: ---"Objects listed" error: 15001ms (09:08:16.811) Dec 01 09:08:16 crc kubenswrapper[4867]: Trace[2138935176]: [15.001360098s] [15.001360098s] END Dec 01 09:08:16 crc kubenswrapper[4867]: I1201 09:08:16.811350 4867 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.772678 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.773770 4867 apiserver.go:52] "Watching apiserver" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.776636 4867 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.776966 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tdw66","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.777326 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.777361 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.777382 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.777426 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.777661 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.777715 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.777836 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.777835 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.777968 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.778065 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tdw66" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.779929 4867 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.781463 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.781705 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.783009 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.783982 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.784964 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.785078 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.785337 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.785482 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.786035 
4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.787654 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.787750 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.787757 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.796156 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.799543 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.810332 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816635 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816686 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816714 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816739 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816759 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816779 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816800 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816841 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816863 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816883 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816905 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 09:08:17 crc 
kubenswrapper[4867]: I1201 09:08:17.816927 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.816949 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817008 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817033 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817076 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817098 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817123 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817148 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817176 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817180 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817200 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817249 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817271 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817295 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817316 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817337 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817359 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817382 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817401 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817425 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817448 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 09:08:17 crc 
kubenswrapper[4867]: I1201 09:08:17.817468 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817490 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817511 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817532 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817590 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817615 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817637 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817663 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817724 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817750 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817772 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 
09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817795 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817825 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817877 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817902 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817924 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817948 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817996 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818022 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818045 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818067 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818092 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:08:17 crc kubenswrapper[4867]: 
I1201 09:08:17.818116 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818141 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818166 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818188 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818213 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817266 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod 
"49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818235 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817371 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817478 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818265 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818290 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818316 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818341 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818363 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818386 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818411 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818432 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818454 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818476 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818497 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818519 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818543 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818570 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818604 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818627 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818651 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" 
(UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818677 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818700 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818726 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818749 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818773 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818797 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818827 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818869 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818893 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818914 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818937 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818962 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818985 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819009 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819032 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819057 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819080 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819127 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819155 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819178 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819203 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819253 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819279 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819304 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819329 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819352 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819375 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 09:08:17 crc 
kubenswrapper[4867]: I1201 09:08:17.819398 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819423 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819445 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819468 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819491 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819514 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819537 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819560 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819581 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819604 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819626 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 09:08:17 crc 
kubenswrapper[4867]: I1201 09:08:17.819652 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819677 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819699 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819723 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819745 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819874 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819906 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819930 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819956 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819981 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820006 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 
09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820030 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820056 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820081 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820106 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820130 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820153 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820176 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820202 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820227 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820251 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820276 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820300 4867 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820325 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820351 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820376 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820406 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820434 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820458 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820482 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820510 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820534 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820557 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:08:17 crc kubenswrapper[4867]: 
I1201 09:08:17.820581 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820605 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820632 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820657 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820680 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820706 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820744 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820770 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820794 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820823 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820867 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: 
I1201 09:08:17.820894 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820920 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820946 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.837366 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.845744 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817508 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817755 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817917 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817932 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.817942 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818077 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818081 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818221 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818234 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818293 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818361 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818524 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818555 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818698 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818830 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.818946 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819289 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819440 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.819644 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820229 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820298 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820359 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820516 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820584 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820587 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820677 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820803 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820858 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820874 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.820965 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:08:18.320945027 +0000 UTC m=+19.780331781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.820990 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.821209 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.821482 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.852557 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.824468 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.824751 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.824972 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.825007 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.825420 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.825476 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.825661 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.825723 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.825943 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.826023 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.826599 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.826613 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.826662 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.826704 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.826797 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.826832 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.826878 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.827039 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.827068 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.827147 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.827248 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.827273 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.827330 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.827833 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.827872 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.827889 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.828019 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.828055 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.828236 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.828255 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.828266 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.828400 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.828540 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.828608 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.828705 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.829031 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.829211 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.829627 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.829647 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.829687 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.830114 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.830316 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.830377 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.830417 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.830532 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.830756 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.831022 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.831180 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.831270 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.831723 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.831900 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.832352 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.853547 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.833533 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.834043 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.834116 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.834446 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.834728 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.834760 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.834801 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.834794 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.834909 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.835265 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.835453 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.835857 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.835899 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.836014 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.836063 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.836095 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.836356 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.837909 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.837975 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.838172 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.840378 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.841083 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.841310 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.841753 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.841918 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.842020 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.842038 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.842326 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.842414 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.844265 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.845098 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.845164 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.845250 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.845437 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.845822 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.846114 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.846652 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.847260 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.847743 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.848002 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.850332 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.850592 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.851219 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.851931 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.852222 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.857139 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.857532 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.857734 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858038 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858298 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858556 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858614 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858646 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858679 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 09:08:17 
crc kubenswrapper[4867]: I1201 09:08:17.858712 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858738 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858765 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858793 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860876 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860912 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860936 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860960 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860983 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861005 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861025 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861047 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861069 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861089 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858822 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861114 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861135 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861153 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861168 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861183 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861199 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861216 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861232 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861246 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861264 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861280 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 
09:08:17.861299 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861315 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861335 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861351 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861374 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861431 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861463 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861487 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861505 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861548 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7920004f-7b75-4925-8961-2629dc17ee30-hosts-file\") pod \"node-resolver-tdw66\" (UID: \"7920004f-7b75-4925-8961-2629dc17ee30\") " pod="openshift-dns/node-resolver-tdw66" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861573 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861590 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861605 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861624 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861646 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861666 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861687 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861710 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8hn7\" (UniqueName: \"kubernetes.io/projected/7920004f-7b75-4925-8961-2629dc17ee30-kube-api-access-z8hn7\") pod \"node-resolver-tdw66\" (UID: \"7920004f-7b75-4925-8961-2629dc17ee30\") " pod="openshift-dns/node-resolver-tdw66" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861731 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861749 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861814 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.858968 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.859413 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.859421 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.859609 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860108 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860437 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860484 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860618 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860813 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.860984 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861087 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861085 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861202 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861338 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861407 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861489 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861664 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861921 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.861955 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.862407 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.862414 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.862619 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.863215 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.863224 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.863491 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.864153 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.864138 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.864345 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.864545 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.864607 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.864693 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.865304 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.865325 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.865572 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.865677 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.865893 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.866062 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.866078 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.866089 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.866162 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.866236 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.866245 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.866263 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:18.366241415 +0000 UTC m=+19.825628169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.866616 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.866831 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.866878 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.867063 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.867138 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.867300 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" 
(OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.867352 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.867498 4867 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.867527 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:18.36750678 +0000 UTC m=+19.826893534 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.867597 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.867624 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.867747 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868021 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868046 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868060 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868071 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868083 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868095 4867 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868106 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868118 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868132 4867 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868143 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868152 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868160 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868170 4867 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868181 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868193 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868206 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868217 4867 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868229 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 
09:08:17.868238 4867 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868249 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868258 4867 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868267 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868279 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870615 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870658 4867 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870670 4867 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870681 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870689 4867 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870698 4867 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870707 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870715 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc 
kubenswrapper[4867]: I1201 09:08:17.870723 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870734 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870743 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870752 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870762 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870771 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870780 4867 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: 
I1201 09:08:17.870789 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870799 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870875 4867 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870870 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870899 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870911 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870921 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870933 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870943 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870953 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.868236 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.870965 4867 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871019 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871050 4867 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871061 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871071 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871082 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871094 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: 
I1201 09:08:17.871107 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871122 4867 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871134 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871146 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871157 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871169 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871182 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871194 4867 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871205 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871217 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871227 4867 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871237 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871247 4867 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871256 4867 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871265 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871274 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871284 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871294 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871306 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871317 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871328 4867 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871340 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 
crc kubenswrapper[4867]: I1201 09:08:17.871351 4867 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871361 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871370 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871378 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871387 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871397 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871406 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871414 4867 reconciler_common.go:293] "Volume detached for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871423 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871431 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871440 4867 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871449 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871458 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871466 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871474 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871483 4867 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871492 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871500 4867 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871507 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871515 4867 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871523 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871531 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 
09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871544 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871551 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871560 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871568 4867 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871576 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871584 4867 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871592 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871599 4867 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871608 4867 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871617 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871627 4867 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871637 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871651 4867 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871696 4867 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871705 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871715 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871727 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871739 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871751 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871762 4867 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871770 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871777 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: 
I1201 09:08:17.871785 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871793 4867 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871800 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871811 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871820 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871829 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871852 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871860 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871869 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871877 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871886 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871894 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871902 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871910 4867 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871918 4867 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871926 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871935 4867 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871944 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871953 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871962 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871972 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.871981 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") 
on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.874598 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.878318 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.878812 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.880589 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.880624 
4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.880639 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.880695 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:18.380677636 +0000 UTC m=+19.840064380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.890189 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.890305 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.890703 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.892087 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.892541 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.893010 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.902134 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.902810 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.905878 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.906130 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.906200 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:17 crc kubenswrapper[4867]: E1201 09:08:17.906407 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:18.406382683 +0000 UTC m=+19.865769437 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.911329 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.911939 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.919975 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.921478 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.923883 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169" exitCode=255 Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.924049 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169"} Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.931244 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.931517 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.932892 4867 scope.go:117] "RemoveContainer" containerID="c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.944179 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.952331 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.965299 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.972991 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973276 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7920004f-7b75-4925-8961-2629dc17ee30-hosts-file\") pod \"node-resolver-tdw66\" (UID: \"7920004f-7b75-4925-8961-2629dc17ee30\") " pod="openshift-dns/node-resolver-tdw66" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973408 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973492 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8hn7\" (UniqueName: \"kubernetes.io/projected/7920004f-7b75-4925-8961-2629dc17ee30-kube-api-access-z8hn7\") pod \"node-resolver-tdw66\" (UID: \"7920004f-7b75-4925-8961-2629dc17ee30\") " pod="openshift-dns/node-resolver-tdw66" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973616 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973667 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973341 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7920004f-7b75-4925-8961-2629dc17ee30-hosts-file\") pod \"node-resolver-tdw66\" (UID: \"7920004f-7b75-4925-8961-2629dc17ee30\") " pod="openshift-dns/node-resolver-tdw66" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973679 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973851 4867 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973938 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974011 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974075 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974126 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974183 4867 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974250 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974321 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974394 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974460 4867 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974513 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974565 4867 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974629 4867 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974680 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974734 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974793 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974866 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974918 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.974976 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975026 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975075 4867 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975124 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 
crc kubenswrapper[4867]: I1201 09:08:17.975189 4867 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975245 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975304 4867 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975358 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975409 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975464 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975533 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975590 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975645 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975728 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975781 4867 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975830 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975913 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.975970 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976022 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" 
DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976072 4867 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976130 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976209 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976290 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976351 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976404 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976458 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc 
kubenswrapper[4867]: I1201 09:08:17.976508 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976562 4867 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976618 4867 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976678 4867 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976732 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976781 4867 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.976868 4867 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:17 crc kubenswrapper[4867]: 
I1201 09:08:17.974168 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.973135 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.985097 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:17 crc kubenswrapper[4867]: I1201 09:08:17.989368 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8hn7\" (UniqueName: \"kubernetes.io/projected/7920004f-7b75-4925-8961-2629dc17ee30-kube-api-access-z8hn7\") pod \"node-resolver-tdw66\" (UID: \"7920004f-7b75-4925-8961-2629dc17ee30\") " pod="openshift-dns/node-resolver-tdw66" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.000600 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.017916 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.027579 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.091687 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.101437 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:08:18 crc kubenswrapper[4867]: W1201 09:08:18.102982 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e28bcdd838bb9dad03cdaecc4507a2356423b8a744906cae13ea7396eca2916e WatchSource:0}: Error finding container e28bcdd838bb9dad03cdaecc4507a2356423b8a744906cae13ea7396eca2916e: Status 404 returned error can't find the container with id e28bcdd838bb9dad03cdaecc4507a2356423b8a744906cae13ea7396eca2916e Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.109063 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:08:18 crc kubenswrapper[4867]: W1201 09:08:18.109987 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-413f4e9e15f6692c1a6bd4c67a7cc30596781a3bfa6b26024333df3ba927dd99 WatchSource:0}: Error finding container 413f4e9e15f6692c1a6bd4c67a7cc30596781a3bfa6b26024333df3ba927dd99: Status 404 returned error can't find the container with id 413f4e9e15f6692c1a6bd4c67a7cc30596781a3bfa6b26024333df3ba927dd99 Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.115063 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tdw66" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.130188 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.135582 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.139379 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.153283 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.173085 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.184082 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.194993 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.206212 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.220153 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.243756 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.259579 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.304424 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.317640 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.342256 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.353898 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.374750 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.385003 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:18 crc 
kubenswrapper[4867]: I1201 09:08:18.385059 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.385088 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.385105 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.385203 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.385217 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.385226 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.385264 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:19.385252403 +0000 UTC m=+20.844639157 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.385309 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:08:19.385303844 +0000 UTC m=+20.844690598 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.385347 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.385365 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:19.385359985 +0000 UTC m=+20.844746739 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.385388 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.385406 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 09:08:19.385401197 +0000 UTC m=+20.844787951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.390594 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.408635 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca3027
9f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.423729 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.435669 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.452693 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.481994 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.485803 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.485931 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.486146 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.486223 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:18 crc kubenswrapper[4867]: E1201 09:08:18.486370 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:19.486349953 +0000 UTC m=+20.945736707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.584226 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mt9t2"] Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.584628 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.586449 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cd237749-4cea-4ff6-a374-8da70f9c879a-rootfs\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.586474 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd237749-4cea-4ff6-a374-8da70f9c879a-proxy-tls\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.586489 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnb2h\" (UniqueName: \"kubernetes.io/projected/cd237749-4cea-4ff6-a374-8da70f9c879a-kube-api-access-gnb2h\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.586703 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd237749-4cea-4ff6-a374-8da70f9c879a-mcd-auth-proxy-config\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.595232 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.595661 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.596049 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.596299 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.598795 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.611761 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.632889 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.647367 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.657764 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.665442 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.674079 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.687152 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cd237749-4cea-4ff6-a374-8da70f9c879a-rootfs\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.687192 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd237749-4cea-4ff6-a374-8da70f9c879a-proxy-tls\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.687215 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnb2h\" (UniqueName: \"kubernetes.io/projected/cd237749-4cea-4ff6-a374-8da70f9c879a-kube-api-access-gnb2h\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.687249 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cd237749-4cea-4ff6-a374-8da70f9c879a-rootfs\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.687264 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd237749-4cea-4ff6-a374-8da70f9c879a-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.687943 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd237749-4cea-4ff6-a374-8da70f9c879a-mcd-auth-proxy-config\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.691358 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd237749-4cea-4ff6-a374-8da70f9c879a-proxy-tls\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.693165 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.704493 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c
026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.706500 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnb2h\" (UniqueName: \"kubernetes.io/projected/cd237749-4cea-4ff6-a374-8da70f9c879a-kube-api-access-gnb2h\") pod \"machine-config-daemon-mt9t2\" (UID: \"cd237749-4cea-4ff6-a374-8da70f9c879a\") " pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 
crc kubenswrapper[4867]: I1201 09:08:18.715124 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.724283 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.734273 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.830247 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.830814 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.832165 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.832902 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.834151 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.834774 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.835498 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.836037 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.836613 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.837222 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.838236 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.838897 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.840003 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.840485 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.841143 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.842231 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.842720 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.843651 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.844083 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.844663 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.845659 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.846241 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.847237 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.847664 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.848802 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.849281 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.849985 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.851158 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.851760 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.852315 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.853635 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.854302 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.855325 4867 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.855454 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.857621 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.858750 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.859280 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.861349 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.862468 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.863517 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.863878 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.864986 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.866135 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.866822 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.868021 4867 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.868738 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.870075 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.870594 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.871620 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.872404 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.873871 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.874303 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.874463 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.875493 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.876083 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.877535 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.878234 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.878799 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.885402 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.897557 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.902180 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.931693 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.932291 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.933188 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71"} Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.933638 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.934009 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"67eb12c3be4873d809905cab1e3ff5175a7d22d7bbe2f081262708672349f121"} Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.936124 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tdw66" event={"ID":"7920004f-7b75-4925-8961-2629dc17ee30","Type":"ContainerStarted","Data":"510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708"} Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.936157 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tdw66" event={"ID":"7920004f-7b75-4925-8961-2629dc17ee30","Type":"ContainerStarted","Data":"d1994101794967844f6c825246fabf5d49f131991890166c086ad29d13488d71"} Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.937021 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6ea747409cdbf70306487c0e6bf2e6933ed3ae9787f3250398cd1a1aff9fe788"} Dec 01 09:08:18 crc 
kubenswrapper[4867]: I1201 09:08:18.938010 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1"} Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.938030 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df"} Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.938040 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"413f4e9e15f6692c1a6bd4c67a7cc30596781a3bfa6b26024333df3ba927dd99"} Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.939625 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0"} Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.939658 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e28bcdd838bb9dad03cdaecc4507a2356423b8a744906cae13ea7396eca2916e"} Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.947706 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.965876 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.995228 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.998792 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-tj9fl"] Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.999111 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-g6dw4"] Dec 01 09:08:18 crc kubenswrapper[4867]: I1201 09:08:18.999746 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.000191 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.008301 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.008467 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.008600 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.009015 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.009440 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.011057 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.012400 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.019803 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.035523 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.052079 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.068838 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.087778 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.090911 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-os-release\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.090966 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-cnibin\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.090984 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-run-k8s-cni-cncf-io\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.090998 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-var-lib-cni-bin\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091014 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-hostroot\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091038 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-system-cni-dir\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091052 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6494ebd3-57c2-4d65-b44a-3e30e76910a9-cni-binary-copy\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091077 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-cni-binary-copy\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091104 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-run-netns\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091119 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-var-lib-cni-multus\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091134 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-var-lib-kubelet\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091148 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-daemon-config\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091164 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k2kcp\" (UniqueName: \"kubernetes.io/projected/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-kube-api-access-k2kcp\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091177 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfxm\" (UniqueName: \"kubernetes.io/projected/6494ebd3-57c2-4d65-b44a-3e30e76910a9-kube-api-access-ddfxm\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091198 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-os-release\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091212 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-run-multus-certs\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091225 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6494ebd3-57c2-4d65-b44a-3e30e76910a9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091247 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-cnibin\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091268 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-socket-dir-parent\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091281 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-system-cni-dir\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091296 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-conf-dir\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091311 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-etc-kubernetes\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.091326 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-cni-dir\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.100937 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.134367 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8cac
a4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.165369 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192496 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-cni-dir\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192537 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-os-release\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192552 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-var-lib-cni-bin\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192566 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-hostroot\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192586 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-cnibin\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192600 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-run-k8s-cni-cncf-io\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192638 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-system-cni-dir\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192652 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6494ebd3-57c2-4d65-b44a-3e30e76910a9-cni-binary-copy\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: 
\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192675 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-cni-binary-copy\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192692 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192707 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-run-netns\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192722 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-var-lib-cni-multus\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192735 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-var-lib-kubelet\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " 
pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192748 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-daemon-config\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192763 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2kcp\" (UniqueName: \"kubernetes.io/projected/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-kube-api-access-k2kcp\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192776 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfxm\" (UniqueName: \"kubernetes.io/projected/6494ebd3-57c2-4d65-b44a-3e30e76910a9-kube-api-access-ddfxm\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192797 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-os-release\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192814 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-run-multus-certs\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 
09:08:19.192843 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6494ebd3-57c2-4d65-b44a-3e30e76910a9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192865 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-cnibin\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192877 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-socket-dir-parent\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192892 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-system-cni-dir\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192906 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-conf-dir\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192919 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-etc-kubernetes\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.192967 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-etc-kubernetes\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.193108 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-cni-dir\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.193350 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-os-release\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.193397 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-var-lib-kubelet\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.193422 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-hostroot\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.193414 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-var-lib-cni-bin\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.193490 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-system-cni-dir\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.193467 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-run-k8s-cni-cncf-io\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.194095 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6494ebd3-57c2-4d65-b44a-3e30e76910a9-cni-binary-copy\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.194545 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-cni-binary-copy\") pod \"multus-tj9fl\" (UID: 
\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.194746 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-cnibin\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.193452 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-cnibin\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.194829 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-conf-dir\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.194881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-run-multus-certs\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.194844 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-socket-dir-parent\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.194859 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-run-netns\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.194934 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-system-cni-dir\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.194955 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-host-var-lib-cni-multus\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.195013 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6494ebd3-57c2-4d65-b44a-3e30e76910a9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.195131 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-os-release\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.195206 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-multus-daemon-config\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.195688 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6494ebd3-57c2-4d65-b44a-3e30e76910a9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.204299 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.286593 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.303208 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfxm\" (UniqueName: \"kubernetes.io/projected/6494ebd3-57c2-4d65-b44a-3e30e76910a9-kube-api-access-ddfxm\") pod \"multus-additional-cni-plugins-g6dw4\" (UID: \"6494ebd3-57c2-4d65-b44a-3e30e76910a9\") " pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.303410 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2kcp\" (UniqueName: \"kubernetes.io/projected/c813b7ba-4c04-44d0-9f3e-3e5f4897fb73-kube-api-access-k2kcp\") pod \"multus-tj9fl\" (UID: \"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\") " pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.328364 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.334799 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.343307 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-tj9fl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.364148 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: W1201 09:08:19.366555 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc813b7ba_4c04_44d0_9f3e_3e5f4897fb73.slice/crio-44682511c30780d03bd427ec677290b6f95dd045cbf332017727b9584b744089 WatchSource:0}: Error finding container 44682511c30780d03bd427ec677290b6f95dd045cbf332017727b9584b744089: Status 404 returned error can't find the container with id 44682511c30780d03bd427ec677290b6f95dd045cbf332017727b9584b744089 Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.395803 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.396104 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.396207 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.396305 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.396463 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.396571 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:21.396558893 +0000 UTC m=+22.855945647 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.396973 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:08:21.396964213 +0000 UTC m=+22.856350967 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.397092 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.397171 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:21.397150548 +0000 UTC m=+22.856537352 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.397249 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.397333 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.397414 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.397503 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:21.397495108 +0000 UTC m=+22.856881862 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.398334 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kk2hn"] Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.399429 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.410643 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6f
de355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.415291 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.435140 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.457969 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 09:08:19 crc 
kubenswrapper[4867]: I1201 09:08:19.475197 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.495076 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500234 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-ovn\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500294 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-node-log\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500323 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500349 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4925\" (UniqueName: \"kubernetes.io/projected/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-kube-api-access-g4925\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" 
Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500377 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-slash\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500402 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-openvswitch\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500422 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-kubelet\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500446 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-log-socket\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500494 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500529 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovn-node-metrics-cert\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500565 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-bin\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500587 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-env-overrides\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500825 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-ovn-kubernetes\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500938 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-netd\") pod \"ovnkube-node-kk2hn\" 
(UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500963 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-script-lib\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.500985 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-systemd-units\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.501002 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-var-lib-openvswitch\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.501034 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-systemd\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.501051 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-netns\") 
pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.501071 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-etc-openvswitch\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.501094 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-config\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.501327 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.501353 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.501372 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.501427 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:21.501404994 +0000 UTC m=+22.960791748 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.515304 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.535300 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.591785 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602225 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-bin\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602259 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-env-overrides\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602285 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-ovn-kubernetes\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602303 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-netd\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602321 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-script-lib\") pod \"ovnkube-node-kk2hn\" (UID: 
\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602339 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-systemd-units\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602355 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-var-lib-openvswitch\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602379 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-systemd\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602398 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-netns\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602413 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-etc-openvswitch\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602428 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-config\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602449 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-ovn\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602467 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-node-log\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602483 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602500 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4925\" (UniqueName: \"kubernetes.io/projected/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-kube-api-access-g4925\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602516 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-slash\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602511 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-netd\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602578 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-openvswitch\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602536 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-openvswitch\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602649 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-kubelet\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 
09:08:19.602674 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-log-socket\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.602726 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovn-node-metrics-cert\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603090 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-env-overrides\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603177 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-systemd\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603215 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-systemd-units\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603249 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-var-lib-openvswitch\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603257 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-script-lib\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603289 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603297 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-ovn\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603429 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-log-socket\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603446 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-node-log\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603461 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-ovn-kubernetes\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603459 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-kubelet\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603475 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-slash\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603483 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-etc-openvswitch\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603496 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-netns\") pod \"ovnkube-node-kk2hn\" (UID: 
\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603561 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-bin\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.603576 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-config\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.607373 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovn-node-metrics-cert\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.633772 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.659489 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4925\" (UniqueName: \"kubernetes.io/projected/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-kube-api-access-g4925\") pod \"ovnkube-node-kk2hn\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.684951 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.711552 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:19 crc kubenswrapper[4867]: W1201 09:08:19.727972 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f21a2a8_1fd5_4a00_a8ac_02c1f24a3f32.slice/crio-17bda87c2e237e71656257748c4d985a9fa6c1fe6585e62964de5ff46ae13308 WatchSource:0}: Error finding container 17bda87c2e237e71656257748c4d985a9fa6c1fe6585e62964de5ff46ae13308: Status 404 returned error can't find the container with id 17bda87c2e237e71656257748c4d985a9fa6c1fe6585e62964de5ff46ae13308 Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.737493 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.785374 4867 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.823886 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.826412 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.826513 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.826572 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.826612 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.826866 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:19 crc kubenswrapper[4867]: E1201 09:08:19.826918 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.846982 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.889442 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.923488 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.942661 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"17bda87c2e237e71656257748c4d985a9fa6c1fe6585e62964de5ff46ae13308"} Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.943704 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" event={"ID":"6494ebd3-57c2-4d65-b44a-3e30e76910a9","Type":"ContainerStarted","Data":"a4fb4fa0feafead2c2e9f6f22fc1336209b37a679f3c11a9f446aa5e30d55316"} Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.945196 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893"} Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.945312 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be"} Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.946225 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tj9fl" event={"ID":"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73","Type":"ContainerStarted","Data":"7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1"} Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.946278 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tj9fl" event={"ID":"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73","Type":"ContainerStarted","Data":"44682511c30780d03bd427ec677290b6f95dd045cbf332017727b9584b744089"} Dec 01 09:08:19 crc kubenswrapper[4867]: I1201 09:08:19.963712 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.010870 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.010998 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.012280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.012407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.012483 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.012667 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.077368 4867 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.077589 4867 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.077585 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.078659 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.078691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.078742 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.078758 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.078785 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: E1201 09:08:20.093754 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.100164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.100222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.100235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.100257 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.100272 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: E1201 09:08:20.117398 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.121262 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.121301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.121313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.121328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.121339 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.130943 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: E1201 09:08:20.133524 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.137731 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.137974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.137982 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.137994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.138003 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: E1201 09:08:20.149779 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.157196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.157229 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.157239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.157253 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.157263 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.178191 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: E1201 09:08:20.179906 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: E1201 09:08:20.180026 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.182042 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.182071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.182083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.182099 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.182109 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.208483 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.245867 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.284785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.284844 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.284857 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.284874 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.284885 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.297549 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.323445 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.366235 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.386617 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.386647 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.386655 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.386670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.386678 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.405553 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.448813 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.483382 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.489005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.489041 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.489050 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.489065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.489073 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.523419 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.567988 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.591994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.592032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.592043 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc 
kubenswrapper[4867]: I1201 09:08:20.592058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.592070 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.607198 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.643438 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.684037 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.694761 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.694811 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.694842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.694866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.694878 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.724112 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.789969 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.798268 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.798316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.798330 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.798350 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.798365 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.901106 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.901141 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.901150 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.901164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.901173 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:20Z","lastTransitionTime":"2025-12-01T09:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.951654 4867 generic.go:334] "Generic (PLEG): container finished" podID="6494ebd3-57c2-4d65-b44a-3e30e76910a9" containerID="10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb" exitCode=0 Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.951753 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" event={"ID":"6494ebd3-57c2-4d65-b44a-3e30e76910a9","Type":"ContainerDied","Data":"10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.953906 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c" exitCode=0 Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.954055 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.962736 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647"} Dec 01 09:08:20 crc kubenswrapper[4867]: I1201 09:08:20.983159 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.000153 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.004222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 
09:08:21.005049 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.005064 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.005084 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.005098 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.016958 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.036328 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.049913 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.064463 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.075451 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.088587 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.109479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.109743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.109757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.109775 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.109788 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.129021 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.163451 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.204987 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.211905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.211953 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.211972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.211987 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.211999 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.243572 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.301484 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.314105 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.314149 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc 
kubenswrapper[4867]: I1201 09:08:21.314158 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.314171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.314181 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.324887 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.372221 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.405039 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.415785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.416107 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.416200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.416284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.416359 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.418080 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.418244 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:08:25.418217566 +0000 UTC m=+26.877604340 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.418337 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.418375 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.418402 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.418482 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.418523 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:25.418514894 +0000 UTC m=+26.877901738 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.418645 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.418715 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:25.418699919 +0000 UTC m=+26.878086673 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.418891 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.418993 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.419068 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.419190 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:25.419173283 +0000 UTC m=+26.878560027 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.443779 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.494003 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.519513 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.519561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.519579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.519595 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 
09:08:21.519605 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.519780 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.519963 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.519993 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.520006 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.520058 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-01 09:08:25.520043007 +0000 UTC m=+26.979429761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.526312 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed2
1\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.565749 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.605791 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.622206 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.622239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc 
kubenswrapper[4867]: I1201 09:08:21.622248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.622263 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.622274 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.644207 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.684647 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.724332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc 
kubenswrapper[4867]: I1201 09:08:21.724383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.724399 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.724424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.724440 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.725997 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.767925 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.810962 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.825948 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.826033 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.826507 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.826726 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.827009 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.827148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.827193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.827217 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.827246 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.827268 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: E1201 09:08:21.827726 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.855338 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.890413 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.929962 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.930034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.930054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:21 crc 
kubenswrapper[4867]: I1201 09:08:21.930077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.930095 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:21Z","lastTransitionTime":"2025-12-01T09:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.972034 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.972469 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.972781 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.973027 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.973275 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.973504 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.974592 4867 generic.go:334] "Generic (PLEG): container finished" podID="6494ebd3-57c2-4d65-b44a-3e30e76910a9" containerID="14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a" exitCode=0 Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.974640 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" event={"ID":"6494ebd3-57c2-4d65-b44a-3e30e76910a9","Type":"ContainerDied","Data":"14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a"} Dec 01 09:08:21 crc kubenswrapper[4867]: I1201 09:08:21.996990 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete 
status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.017940 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/o
s-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.034252 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.034307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.034324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.034350 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.034367 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.040527 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9j4ch"] Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.041268 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.049726 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.050044 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.055591 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.062070 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"dat
a-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.076536 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.127176 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.127333 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtgxd\" (UniqueName: \"kubernetes.io/projected/e0da9082-ce5b-48ef-ad08-d3f3c75ea937-kube-api-access-qtgxd\") pod \"node-ca-9j4ch\" (UID: \"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\") " pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.127392 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e0da9082-ce5b-48ef-ad08-d3f3c75ea937-serviceca\") pod \"node-ca-9j4ch\" (UID: \"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\") " pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.127413 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0da9082-ce5b-48ef-ad08-d3f3c75ea937-host\") pod \"node-ca-9j4ch\" (UID: 
\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\") " pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.137002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.137035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.137045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.137059 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.137070 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.165337 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.205279 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.228784 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtgxd\" (UniqueName: \"kubernetes.io/projected/e0da9082-ce5b-48ef-ad08-d3f3c75ea937-kube-api-access-qtgxd\") pod \"node-ca-9j4ch\" (UID: \"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\") " pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.228866 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e0da9082-ce5b-48ef-ad08-d3f3c75ea937-serviceca\") pod \"node-ca-9j4ch\" (UID: \"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\") " pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.228891 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0da9082-ce5b-48ef-ad08-d3f3c75ea937-host\") pod \"node-ca-9j4ch\" (UID: \"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\") " 
pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.228941 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0da9082-ce5b-48ef-ad08-d3f3c75ea937-host\") pod \"node-ca-9j4ch\" (UID: \"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\") " pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.229720 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e0da9082-ce5b-48ef-ad08-d3f3c75ea937-serviceca\") pod \"node-ca-9j4ch\" (UID: \"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\") " pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.239672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.239712 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.239722 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.239738 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.239750 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.250100 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.271187 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtgxd\" (UniqueName: \"kubernetes.io/projected/e0da9082-ce5b-48ef-ad08-d3f3c75ea937-kube-api-access-qtgxd\") pod \"node-ca-9j4ch\" (UID: \"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\") " pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.304244 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.344931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.344989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.345000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.345014 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.345027 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.354375 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.363473 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9j4ch" Dec 01 09:08:22 crc kubenswrapper[4867]: W1201 09:08:22.381437 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0da9082_ce5b_48ef_ad08_d3f3c75ea937.slice/crio-bca8b45554008e3b6134f3d5658bd50f0442201660c0c3cfb0aa8a836c0012ba WatchSource:0}: Error finding container bca8b45554008e3b6134f3d5658bd50f0442201660c0c3cfb0aa8a836c0012ba: Status 404 returned error can't find the container with id bca8b45554008e3b6134f3d5658bd50f0442201660c0c3cfb0aa8a836c0012ba Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.384578 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.428221 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.447421 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.447458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.447468 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.447484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.447495 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.463035 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.505211 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.540767 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f
329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.550842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.550930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.550942 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.550957 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.550968 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.587927 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.623583 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.653584 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.653614 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.653623 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.653635 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.653644 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.662994 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.704350 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.747221 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.757264 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.757321 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.757337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.757357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.757373 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.788891 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.824518 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.859974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.860019 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.860032 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.860051 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.860062 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.864230 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b8
2903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.912886 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.948149 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.962744 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.962803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.962847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.962873 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.962896 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:22Z","lastTransitionTime":"2025-12-01T09:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.980026 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9j4ch" event={"ID":"e0da9082-ce5b-48ef-ad08-d3f3c75ea937","Type":"ContainerStarted","Data":"e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.980110 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9j4ch" event={"ID":"e0da9082-ce5b-48ef-ad08-d3f3c75ea937","Type":"ContainerStarted","Data":"bca8b45554008e3b6134f3d5658bd50f0442201660c0c3cfb0aa8a836c0012ba"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.983417 4867 generic.go:334] "Generic (PLEG): container finished" podID="6494ebd3-57c2-4d65-b44a-3e30e76910a9" containerID="c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d" exitCode=0 Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.983515 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" event={"ID":"6494ebd3-57c2-4d65-b44a-3e30e76910a9","Type":"ContainerDied","Data":"c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d"} Dec 01 09:08:22 crc kubenswrapper[4867]: I1201 09:08:22.989773 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.023455 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.065849 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.065901 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.065914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 
09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.065928 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.065936 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.067011 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.104124 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.147302 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.168884 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.168985 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.169007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.169078 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.169095 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.192895 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.225530 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.265111 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.272369 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.272415 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.272428 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.272446 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.272458 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.307545 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.346157 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.374636 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.374668 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.374675 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.374687 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.374697 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.387766 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.424909 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.464869 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.477912 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.477979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.478004 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.478033 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.478055 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.503643 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b8
2903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.552475 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.580487 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.580522 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.580531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.580543 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.580553 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.585611 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.621565 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.664036 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.682624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.682675 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.682686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.682704 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.682717 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.703030 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.741457 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.785605 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.785642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.785650 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.785663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.785673 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.826338 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.826402 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:23 crc kubenswrapper[4867]: E1201 09:08:23.826464 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.826338 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:23 crc kubenswrapper[4867]: E1201 09:08:23.826566 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:23 crc kubenswrapper[4867]: E1201 09:08:23.826663 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.887384 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.887447 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.887460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.887476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.887511 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.989014 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.989054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.989066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.989083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.989096 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:23Z","lastTransitionTime":"2025-12-01T09:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.991156 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.993719 4867 generic.go:334] "Generic (PLEG): container finished" podID="6494ebd3-57c2-4d65-b44a-3e30e76910a9" containerID="9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7" exitCode=0 Dec 01 09:08:23 crc kubenswrapper[4867]: I1201 09:08:23.993779 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" event={"ID":"6494ebd3-57c2-4d65-b44a-3e30e76910a9","Type":"ContainerDied","Data":"9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7"} Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.011140 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.024560 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.037348 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.047846 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.057984 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.069736 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.086432 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f5
8408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8c
aca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.091440 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.091484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.091497 4867 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.091517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.091529 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:24Z","lastTransitionTime":"2025-12-01T09:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.100801 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.163358 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.189994 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.197280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.197340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:24 crc 
kubenswrapper[4867]: I1201 09:08:24.197354 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.197367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.197379 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:24Z","lastTransitionTime":"2025-12-01T09:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.260197 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a16391
10a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.303538 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.306469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.306496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.306505 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.306517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.306526 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:24Z","lastTransitionTime":"2025-12-01T09:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.319783 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.334092 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.350901 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.414687 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.415122 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.415390 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.416230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.416322 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:24Z","lastTransitionTime":"2025-12-01T09:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.519224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.519275 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.519287 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.519310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.519324 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:24Z","lastTransitionTime":"2025-12-01T09:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.622453 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.622931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.622941 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.622956 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.622967 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:24Z","lastTransitionTime":"2025-12-01T09:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.725249 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.725612 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.725798 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.726070 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.726248 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:24Z","lastTransitionTime":"2025-12-01T09:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.829244 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.829271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.829280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.829293 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.829301 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:24Z","lastTransitionTime":"2025-12-01T09:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.932404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.932453 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.932471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.932492 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:24 crc kubenswrapper[4867]: I1201 09:08:24.932508 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:24Z","lastTransitionTime":"2025-12-01T09:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.004720 4867 generic.go:334] "Generic (PLEG): container finished" podID="6494ebd3-57c2-4d65-b44a-3e30e76910a9" containerID="abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17" exitCode=0 Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.004800 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" event={"ID":"6494ebd3-57c2-4d65-b44a-3e30e76910a9","Type":"ContainerDied","Data":"abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.028932 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.042277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.042643 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.042862 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.043032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.043281 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.051231 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.068616 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.101543 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.117769 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.133709 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.147533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.147576 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.147590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.147605 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.147617 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.147908 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.166478 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.181015 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.195880 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.209127 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.226090 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.243264 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.253531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.253579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.253591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.253610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.253625 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.258803 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.267926 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:25Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.356139 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.356422 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.356536 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.356632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.356711 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.459052 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.459442 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.459557 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.459654 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.459842 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.459958 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:33.459942641 +0000 UTC m=+34.919329415 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.460119 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:08:33.460108305 +0000 UTC m=+34.919495079 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.460271 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.460354 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.460428 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.460536 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:33.460524616 +0000 UTC m=+34.919911390 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.460700 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.460825 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:33.460797624 +0000 UTC m=+34.920184398 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.461627 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.461756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.461998 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.462137 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.462254 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.560763 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.561154 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.561296 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.561616 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:33.561592856 +0000 UTC m=+35.020979640 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.560583 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.565368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.565413 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.565424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.565439 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.565450 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.667110 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.667314 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.667403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.667470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.667525 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.769915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.769970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.769984 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.770008 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.770025 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.826173 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.826434 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.826507 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.826700 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.826693 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:25 crc kubenswrapper[4867]: E1201 09:08:25.827133 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.872650 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.872682 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.872690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.872702 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.872715 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.974835 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.974884 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.974895 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.974912 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:25 crc kubenswrapper[4867]: I1201 09:08:25.974926 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:25Z","lastTransitionTime":"2025-12-01T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.077283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.077326 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.077337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.077354 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.077365 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:26Z","lastTransitionTime":"2025-12-01T09:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.179428 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.179462 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.179473 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.179488 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.179499 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:26Z","lastTransitionTime":"2025-12-01T09:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.281518 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.281555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.281589 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.281607 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.281618 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:26Z","lastTransitionTime":"2025-12-01T09:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.383487 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.383531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.383541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.383557 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.383566 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:26Z","lastTransitionTime":"2025-12-01T09:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.486207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.486236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.486244 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.486256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.486264 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:26Z","lastTransitionTime":"2025-12-01T09:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.588504 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.588558 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.588568 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.588585 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.588596 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:26Z","lastTransitionTime":"2025-12-01T09:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.691042 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.691110 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.691133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.691164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.691187 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:26Z","lastTransitionTime":"2025-12-01T09:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.795000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.795412 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.795427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.795856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.795907 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:26Z","lastTransitionTime":"2025-12-01T09:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.898650 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.898683 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.898692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.898709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:26 crc kubenswrapper[4867]: I1201 09:08:26.898720 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:26Z","lastTransitionTime":"2025-12-01T09:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.001349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.001406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.001427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.001450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.001468 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.014456 4867 generic.go:334] "Generic (PLEG): container finished" podID="6494ebd3-57c2-4d65-b44a-3e30e76910a9" containerID="399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc" exitCode=0 Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.014535 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" event={"ID":"6494ebd3-57c2-4d65-b44a-3e30e76910a9","Type":"ContainerDied","Data":"399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.026813 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.027620 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.027690 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.035924 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.055086 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.059055 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:27 crc 
kubenswrapper[4867]: I1201 09:08:27.068334 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.073537 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/priva
te\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.097297 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.104373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.104403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.104411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.104423 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.104431 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.108941 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.117929 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.128938 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.137645 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.146585 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.158755 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.175858 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f5
8408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8c
aca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.186752 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.198843 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.205945 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.205974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.205982 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.205994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.206003 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.211170 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.224326 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a16391
10a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.236531 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.254317 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.265561 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.275654 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.284418 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.296714 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.307859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.307891 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.307899 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.307912 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.307921 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.315759 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.329389 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.339619 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.355588 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.369345 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.380923 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.392067 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.408258 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.410593 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.410626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.410636 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc 
kubenswrapper[4867]: I1201 09:08:27.410652 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.410663 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.426861 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:27Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.513496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.513527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.513537 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.513549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.513558 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.616048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.616085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.616095 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.616109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.616118 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.718625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.718654 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.718663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.718675 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.718685 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.820802 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.820851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.820861 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.820878 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.820889 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.826522 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.826590 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.826537 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:27 crc kubenswrapper[4867]: E1201 09:08:27.826651 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:27 crc kubenswrapper[4867]: E1201 09:08:27.826780 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:27 crc kubenswrapper[4867]: E1201 09:08:27.826924 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.924120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.924190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.924208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.924737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:27 crc kubenswrapper[4867]: I1201 09:08:27.924800 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:27Z","lastTransitionTime":"2025-12-01T09:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.028152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.028180 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.028188 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.028200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.028209 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.032073 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.033361 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" event={"ID":"6494ebd3-57c2-4d65-b44a-3e30e76910a9","Type":"ContainerStarted","Data":"238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.055264 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.085983 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.097546 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.115877 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.129348 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.130171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.130196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.130206 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.130222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.130232 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.139383 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.151911 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.163137 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.172753 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.190835 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.205012 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.223833 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.232915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.232968 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.232982 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.233000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.233012 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.237289 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.258437 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.272710 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.335034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.335076 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.335087 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.335128 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.335139 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.437974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.438035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.438051 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.438079 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.438099 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.540366 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.540403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.540413 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.540429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.540440 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.642755 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.642791 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.642803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.642834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.642847 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.745574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.745625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.745636 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.745653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.745664 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.848251 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.848277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.848285 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.848297 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.848306 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.849290 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.867607 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.881425 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.906215 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.922425 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.940472 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.951022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.951085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.951109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.951138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.951160 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:28Z","lastTransitionTime":"2025-12-01T09:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.959897 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.981366 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:28 crc kubenswrapper[4867]: I1201 09:08:28.995581 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.021322 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.032916 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.036703 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/0.log" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.038775 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" 
containerID="e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc" exitCode=1 Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.039682 4867 scope.go:117] "RemoveContainer" containerID="e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.039859 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.050489 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.054089 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.054145 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.054162 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.054185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.054202 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.071858 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.088123 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.099473 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.109892 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.120087 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.130759 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.143410 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.153654 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.156467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.156502 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.156518 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.156541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.156556 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.166582 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.191371 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.210426 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.223795 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.245043 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353
e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.259202 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.259403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.259440 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.259451 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.259466 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.259480 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.271244 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.282797 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.304957 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"message\\\":\\\"(0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 09:08:28.562310 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562588 6072 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562847 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562883 6072 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 09:08:28.563183 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:08:28.563198 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:08:28.563237 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:08:28.563277 6072 factory.go:656] Stopping watch factory\\\\nI1201 09:08:28.563293 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:08:28.563274 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:08:28.563358 6072 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.319053 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.361599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.361656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.361670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.361689 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.361700 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.463707 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.463751 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.463760 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.463774 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.463782 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.565759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.565830 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.565844 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.565862 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.565874 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.668330 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.668361 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.668370 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.668384 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.668393 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.769999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.770032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.770040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.770053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.770061 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.826699 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.826699 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.826827 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:29 crc kubenswrapper[4867]: E1201 09:08:29.826855 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:29 crc kubenswrapper[4867]: E1201 09:08:29.826896 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:29 crc kubenswrapper[4867]: E1201 09:08:29.827034 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.872229 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.872274 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.872282 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.872296 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.872306 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.975935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.975980 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.975989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.976002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:29 crc kubenswrapper[4867]: I1201 09:08:29.976012 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:29Z","lastTransitionTime":"2025-12-01T09:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.045911 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/0.log" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.050015 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.050213 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.074500 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.079145 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.079197 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.079212 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc 
kubenswrapper[4867]: I1201 09:08:30.079230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.079242 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.090717 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.113963 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.133728 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.149643 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.172298 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2"] Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.173023 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.176566 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.178090 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.181435 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.181479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.181495 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.181518 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.181535 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.191865 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.212156 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.221085 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e8b0a64a-1b1c-49e2-9715-29505b2c124b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.221146 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e8b0a64a-1b1c-49e2-9715-29505b2c124b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.221245 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qfp4w\" (UniqueName: \"kubernetes.io/projected/e8b0a64a-1b1c-49e2-9715-29505b2c124b-kube-api-access-qfp4w\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.221546 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e8b0a64a-1b1c-49e2-9715-29505b2c124b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.233264 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0
b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.253621 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.278243 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.284254 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.284337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.284354 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.284383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.284402 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.302493 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.323207 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e8b0a64a-1b1c-49e2-9715-29505b2c124b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.323260 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e8b0a64a-1b1c-49e2-9715-29505b2c124b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.323304 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfp4w\" (UniqueName: \"kubernetes.io/projected/e8b0a64a-1b1c-49e2-9715-29505b2c124b-kube-api-access-qfp4w\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.323367 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e8b0a64a-1b1c-49e2-9715-29505b2c124b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.324216 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/e8b0a64a-1b1c-49e2-9715-29505b2c124b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.324884 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.325139 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e8b0a64a-1b1c-49e2-9715-29505b2c124b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.329559 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e8b0a64a-1b1c-49e2-9715-29505b2c124b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.343452 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.359734 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfp4w\" (UniqueName: \"kubernetes.io/projected/e8b0a64a-1b1c-49e2-9715-29505b2c124b-kube-api-access-qfp4w\") pod \"ovnkube-control-plane-749d76644c-zd2r2\" (UID: \"e8b0a64a-1b1c-49e2-9715-29505b2c124b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.367284 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.387111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.387163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.387174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc 
kubenswrapper[4867]: I1201 09:08:30.387190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.387201 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.390900 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"message\\\":\\\"(0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 09:08:28.562310 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562588 6072 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562847 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562883 6072 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 09:08:28.563183 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:08:28.563198 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:08:28.563237 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:08:28.563277 6072 factory.go:656] Stopping watch factory\\\\nI1201 09:08:28.563293 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:08:28.563274 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:08:28.563358 6072 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.408595 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.428895 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.444224 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.471522 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"message\\\":\\\"(0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 09:08:28.562310 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562588 6072 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562847 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562883 6072 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 09:08:28.563183 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:08:28.563198 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:08:28.563237 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:08:28.563277 6072 factory.go:656] Stopping watch factory\\\\nI1201 09:08:28.563293 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:08:28.563274 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:08:28.563358 6072 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.485170 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.489666 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.489709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.489724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.489745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.489760 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.495895 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.499543 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: W1201 09:08:30.510682 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8b0a64a_1b1c_49e2_9715_29505b2c124b.slice/crio-832c7be22d7186dc2175738db19a8b99816075a4768aea5a6e59def75c8f2b86 WatchSource:0}: Error finding container 832c7be22d7186dc2175738db19a8b99816075a4768aea5a6e59def75c8f2b86: Status 404 returned error can't find the container with id 832c7be22d7186dc2175738db19a8b99816075a4768aea5a6e59def75c8f2b86 Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.515556 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.531614 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.547912 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.548232 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.548280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.548293 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.548310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.548323 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.559670 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: E1201 09:08:30.563166 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.566973 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.567013 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.567024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.567040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.567051 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: E1201 09:08:30.582942 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.583600 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b2670
2f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca
8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.587006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.587044 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.587056 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.587073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.587084 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.598852 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: E1201 09:08:30.601398 4867 kubelet_node_status.go:585] 
"Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.605099 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.605129 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.605140 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.605155 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.605163 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.612240 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: E1201 09:08:30.619717 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.625140 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.625527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.625576 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.625590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.625606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.626077 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: E1201 09:08:30.638602 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: E1201 09:08:30.639180 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.641230 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee6
5f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a
3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2
fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\"
:\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.642828 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.642895 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.642909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.642926 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.643345 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.656160 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.705411 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.716318 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/h
ost\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.726035 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.737754 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.745754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.745943 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.745966 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.745983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.745994 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.748988 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc357
7de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.760351 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.770956 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.785306 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.796669 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.814409 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353
e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.825245 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.840763 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.848924 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.849089 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.849148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc 
kubenswrapper[4867]: I1201 09:08:30.849224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.849279 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.863042 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"message\\\":\\\"(0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 09:08:28.562310 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562588 6072 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562847 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562883 6072 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 09:08:28.563183 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:08:28.563198 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:08:28.563237 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:08:28.563277 6072 factory.go:656] Stopping watch factory\\\\nI1201 09:08:28.563293 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:08:28.563274 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:08:28.563358 6072 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.878272 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28
edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.892060 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.904421 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.915685 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.951834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.952041 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.952124 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.952197 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:30 crc kubenswrapper[4867]: I1201 09:08:30.952300 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:30Z","lastTransitionTime":"2025-12-01T09:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.053898 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.053937 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.053946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.053960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.053971 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.056606 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" event={"ID":"e8b0a64a-1b1c-49e2-9715-29505b2c124b","Type":"ContainerStarted","Data":"832c7be22d7186dc2175738db19a8b99816075a4768aea5a6e59def75c8f2b86"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.058348 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/1.log" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.058923 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/0.log" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.061191 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840" exitCode=1 Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.061216 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.061250 4867 scope.go:117] "RemoveContainer" containerID="e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.061846 4867 scope.go:117] "RemoveContainer" containerID="51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840" Dec 01 09:08:31 crc kubenswrapper[4867]: E1201 09:08:31.062218 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=ovnkube-controller pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.080600 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.092017 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.104002 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.115040 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.127069 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.139237 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.151125 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.155839 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.155869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.155881 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.155900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.155911 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.166429 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.186614 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.197170 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.206757 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.223609 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"message\\\":\\\"(0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 09:08:28.562310 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562588 6072 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562847 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562883 6072 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 09:08:28.563183 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:08:28.563198 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:08:28.563237 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:08:28.563277 6072 factory.go:656] Stopping watch factory\\\\nI1201 09:08:28.563293 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:08:28.563274 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:08:28.563358 6072 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"erver/kube-apiserver-crc in node crc\\\\nI1201 09:08:29.861944 6198 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 09:08:29.861948 6198 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 09:08:29.861954 6198 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1201 09:08:29.861838 6198 ovnkube.go:137] failed to run ovnkube: [failed to start 
network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:29.861959 6198 obj_retry.go:365] Adding \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.237265 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbe
d5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.251356 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.258214 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.258247 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.258258 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc 
kubenswrapper[4867]: I1201 09:08:31.258275 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.258285 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.260530 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.274052 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.380800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.380856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.380866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.380882 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.380891 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.483038 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.483067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.483075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.483087 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.483096 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.585159 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.585432 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.585506 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.585603 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.585684 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.688027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.688065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.688075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.688090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.688100 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.711986 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-n7wvd"] Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.717241 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:31 crc kubenswrapper[4867]: E1201 09:08:31.717403 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.734382 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.744682 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.758006 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.771195 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc 
kubenswrapper[4867]: I1201 09:08:31.783262 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.789682 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.789725 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.789735 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.789750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.789760 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.795987 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.805341 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.823763 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.826098 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.826142 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:31 crc kubenswrapper[4867]: E1201 09:08:31.826207 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:31 crc kubenswrapper[4867]: E1201 09:08:31.826273 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.826444 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:31 crc kubenswrapper[4867]: E1201 09:08:31.826565 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.836871 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.848120 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.861832 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.875344 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.883896 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv845\" (UniqueName: \"kubernetes.io/projected/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-kube-api-access-qv845\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.884023 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.886954 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.891523 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc 
kubenswrapper[4867]: I1201 09:08:31.891556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.891564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.891576 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.891585 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.899607 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28
edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.909724 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.918734 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.933726 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e111ebab6bf14414694fb2717f4e7c184bf3f7f19533459ed402d0b7fe3735fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"message\\\":\\\"(0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 09:08:28.562310 6072 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562588 6072 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562847 6072 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:08:28.562883 6072 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 09:08:28.563183 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:08:28.563198 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:08:28.563237 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:08:28.563277 6072 factory.go:656] Stopping watch factory\\\\nI1201 09:08:28.563293 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:08:28.563274 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:08:28.563358 6072 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"erver/kube-apiserver-crc in node crc\\\\nI1201 09:08:29.861944 6198 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 09:08:29.861948 6198 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 09:08:29.861954 6198 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1201 09:08:29.861838 6198 ovnkube.go:137] failed to run ovnkube: [failed to start 
network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:29.861959 6198 obj_retry.go:365] Adding \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.985339 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv845\" 
(UniqueName: \"kubernetes.io/projected/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-kube-api-access-qv845\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.985407 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:31 crc kubenswrapper[4867]: E1201 09:08:31.985577 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:31 crc kubenswrapper[4867]: E1201 09:08:31.985673 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs podName:c3ff1be1-b98b-483b-83ca-eb2255f66c7c nodeName:}" failed. No retries permitted until 2025-12-01 09:08:32.485651272 +0000 UTC m=+33.945038066 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs") pod "network-metrics-daemon-n7wvd" (UID: "c3ff1be1-b98b-483b-83ca-eb2255f66c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.994256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.994320 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.994344 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.994387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:31 crc kubenswrapper[4867]: I1201 09:08:31.994415 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:31Z","lastTransitionTime":"2025-12-01T09:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.008594 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv845\" (UniqueName: \"kubernetes.io/projected/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-kube-api-access-qv845\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.066108 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/1.log" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.071898 4867 scope.go:117] "RemoveContainer" containerID="51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.072091 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" event={"ID":"e8b0a64a-1b1c-49e2-9715-29505b2c124b","Type":"ContainerStarted","Data":"af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.072115 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" event={"ID":"e8b0a64a-1b1c-49e2-9715-29505b2c124b","Type":"ContainerStarted","Data":"4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92"} Dec 01 09:08:32 crc kubenswrapper[4867]: E1201 09:08:32.072212 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" Dec 01 09:08:32 
crc kubenswrapper[4867]: I1201 09:08:32.096750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.096800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.096846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.096868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.096882 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:32Z","lastTransitionTime":"2025-12-01T09:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.104880 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.119754 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.136088 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.154530 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.169897 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.184279 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.200062 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.200129 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.200148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.200174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.200193 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:32Z","lastTransitionTime":"2025-12-01T09:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.207620 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.224444 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.241661 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.269950 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"erver/kube-apiserver-crc in node crc\\\\nI1201 09:08:29.861944 6198 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 09:08:29.861948 6198 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 
09:08:29.861954 6198 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1201 09:08:29.861838 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:29.861959 6198 obj_retry.go:365] Adding \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.283289 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.294010 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.302200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.302249 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.302261 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.302277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.302289 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:32Z","lastTransitionTime":"2025-12-01T09:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.309512 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.324253 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc 
kubenswrapper[4867]: I1201 09:08:32.347591 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.363209 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.373040 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb
276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.386288 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.397189 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.404411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.404601 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.404699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.404774 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.404868 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:32Z","lastTransitionTime":"2025-12-01T09:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.408611 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.419529 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.436715 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.449071 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.469629 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353
e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.482528 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.488986 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:32 crc kubenswrapper[4867]: E1201 09:08:32.489101 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:32 crc kubenswrapper[4867]: E1201 09:08:32.489273 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs podName:c3ff1be1-b98b-483b-83ca-eb2255f66c7c nodeName:}" failed. No retries permitted until 2025-12-01 09:08:33.489255701 +0000 UTC m=+34.948642455 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs") pod "network-metrics-daemon-n7wvd" (UID: "c3ff1be1-b98b-483b-83ca-eb2255f66c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.493892 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.504485 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.506589 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.506619 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.506629 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:32 crc 
kubenswrapper[4867]: I1201 09:08:32.506643 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.506652 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:32Z","lastTransitionTime":"2025-12-01T09:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.522560 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"erver/kube-apiserver-crc in node crc\\\\nI1201 09:08:29.861944 6198 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 09:08:29.861948 6198 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 
09:08:29.861954 6198 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1201 09:08:29.861838 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:29.861959 6198 obj_retry.go:365] Adding \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.536208 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28
edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.548292 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.581969 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2dd
c2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.609045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.609081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.609089 4867 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.609101 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.609109 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:32Z","lastTransitionTime":"2025-12-01T09:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.624295 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 
01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.661215 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.703178 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.712150 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.712193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.712205 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.712222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.712235 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:32Z","lastTransitionTime":"2025-12-01T09:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.815400 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.815467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.815486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.815508 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.815524 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:32Z","lastTransitionTime":"2025-12-01T09:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.919174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.919649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.919800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.919935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:32 crc kubenswrapper[4867]: I1201 09:08:32.920004 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:32Z","lastTransitionTime":"2025-12-01T09:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.023098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.023162 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.023186 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.023215 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.023240 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.125928 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.125990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.126007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.126032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.126050 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.228634 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.228692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.228709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.228735 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.228791 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.331840 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.331877 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.331888 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.331903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.331914 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.435121 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.435196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.435223 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.435257 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.435293 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.500219 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.500389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.500439 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.500470 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.500642 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.500516 
4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.500651 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.500658 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.500840 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.500860 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.500937 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.500737 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs podName:c3ff1be1-b98b-483b-83ca-eb2255f66c7c nodeName:}" failed. 
No retries permitted until 2025-12-01 09:08:35.500713145 +0000 UTC m=+36.960099939 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs") pod "network-metrics-daemon-n7wvd" (UID: "c3ff1be1-b98b-483b-83ca-eb2255f66c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.501008 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:49.500986102 +0000 UTC m=+50.960372896 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.501044 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:49.501030214 +0000 UTC m=+50.960416998 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.501074 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:49.501062625 +0000 UTC m=+50.960449409 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.501305 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:08:49.50128591 +0000 UTC m=+50.960672704 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.538455 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.538513 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.538530 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.538581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.538601 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.601879 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.602169 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.602220 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.602240 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.602325 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:08:49.602302679 +0000 UTC m=+51.061689463 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.641374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.641470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.641494 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.641541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.641564 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.744736 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.744804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.744872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.744901 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.744926 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.826399 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.826442 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.826445 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.826596 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.826636 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.826760 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.826889 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:33 crc kubenswrapper[4867]: E1201 09:08:33.826977 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.847044 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.847112 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.847136 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.847166 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.847192 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.949788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.949890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.949908 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.949930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:33 crc kubenswrapper[4867]: I1201 09:08:33.949946 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:33Z","lastTransitionTime":"2025-12-01T09:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.053306 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.053353 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.053368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.053386 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.053398 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.156180 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.156232 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.156248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.156270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.156284 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.258524 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.258582 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.258591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.258604 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.258612 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.361863 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.361914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.361927 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.361948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.361963 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.464659 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.464704 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.464715 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.464735 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.464749 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.566661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.566705 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.566717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.566732 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.566743 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.669625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.669679 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.669687 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.669701 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.669709 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.771547 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.771596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.771611 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.771630 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.771645 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.874211 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.874253 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.874266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.874282 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.874298 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.976875 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.976921 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.976939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.976959 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:34 crc kubenswrapper[4867]: I1201 09:08:34.976973 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:34Z","lastTransitionTime":"2025-12-01T09:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.079379 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.079431 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.079439 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.079452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.079461 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:35Z","lastTransitionTime":"2025-12-01T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.181607 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.181643 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.181653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.181666 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.181675 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:35Z","lastTransitionTime":"2025-12-01T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.284995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.285069 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.285092 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.285115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.285133 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:35Z","lastTransitionTime":"2025-12-01T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.388649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.388716 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.388743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.388772 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.388795 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:35Z","lastTransitionTime":"2025-12-01T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.491675 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.491738 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.491760 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.491787 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.491808 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:35Z","lastTransitionTime":"2025-12-01T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.522517 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:35 crc kubenswrapper[4867]: E1201 09:08:35.522666 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:35 crc kubenswrapper[4867]: E1201 09:08:35.522750 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs podName:c3ff1be1-b98b-483b-83ca-eb2255f66c7c nodeName:}" failed. No retries permitted until 2025-12-01 09:08:39.522724279 +0000 UTC m=+40.982111063 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs") pod "network-metrics-daemon-n7wvd" (UID: "c3ff1be1-b98b-483b-83ca-eb2255f66c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.594135 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.594181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.594199 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.594218 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.594233 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:35Z","lastTransitionTime":"2025-12-01T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.696995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.697046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.697062 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.697084 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.697102 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:35Z","lastTransitionTime":"2025-12-01T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.799586 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.799656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.799682 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.799713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.799736 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:35Z","lastTransitionTime":"2025-12-01T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.826457 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.826510 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:35 crc kubenswrapper[4867]: E1201 09:08:35.826975 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.826634 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:35 crc kubenswrapper[4867]: E1201 09:08:35.827209 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:35 crc kubenswrapper[4867]: E1201 09:08:35.827084 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.826585 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:35 crc kubenswrapper[4867]: E1201 09:08:35.827465 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.903048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.903118 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.903144 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.903178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:35 crc kubenswrapper[4867]: I1201 09:08:35.903204 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:35Z","lastTransitionTime":"2025-12-01T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.006415 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.006805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.007065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.007260 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.007480 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.109835 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.110080 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.110163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.110225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.110294 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.213502 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.213572 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.213596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.213624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.213645 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.318296 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.318398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.318418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.318444 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.318461 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.424219 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.424254 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.424263 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.424280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.424290 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.526609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.526671 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.526690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.526711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.526728 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.629038 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.629108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.629119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.629131 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.629139 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.731532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.731594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.731611 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.731659 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.731675 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.833174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.833231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.833249 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.833271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.833288 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.935304 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.935380 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.935404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.935435 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:36 crc kubenswrapper[4867]: I1201 09:08:36.935460 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:36Z","lastTransitionTime":"2025-12-01T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.037929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.038004 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.038023 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.038048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.038069 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.140606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.140935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.141002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.141070 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.141166 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.243661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.244573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.244740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.244991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.245150 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.347776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.348030 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.348094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.348155 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.348209 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.450298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.450342 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.450354 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.450369 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.450378 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.552899 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.552940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.552951 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.552967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.552977 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.654878 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.655086 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.655161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.655271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.655347 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.758118 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.758188 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.758206 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.758230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.758248 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.826162 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.826402 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.826354 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.826207 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:37 crc kubenswrapper[4867]: E1201 09:08:37.826863 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:37 crc kubenswrapper[4867]: E1201 09:08:37.826997 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:37 crc kubenswrapper[4867]: E1201 09:08:37.827190 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:37 crc kubenswrapper[4867]: E1201 09:08:37.827276 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.861014 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.861071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.861086 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.861107 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.861121 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.963452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.963486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.963498 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.963513 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:37 crc kubenswrapper[4867]: I1201 09:08:37.963525 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:37Z","lastTransitionTime":"2025-12-01T09:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.066330 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.066372 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.066398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.066418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.066434 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.168405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.168652 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.168719 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.168779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.168870 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.271663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.272229 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.272417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.272594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.272801 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.374785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.374844 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.374854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.374868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.374880 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.477591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.477871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.477972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.478044 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.478113 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.581419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.581691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.581790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.581969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.582071 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.687258 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.687299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.687316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.687332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.687343 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.790105 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.790140 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.790148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.790161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.790171 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.840233 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.850466 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.859975 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.871080 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.884697 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.892981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.893010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.893022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.893233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.893248 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.898040 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.917500 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.928358 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.939843 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.950163 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.966436 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"erver/kube-apiserver-crc in node crc\\\\nI1201 09:08:29.861944 6198 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 09:08:29.861948 6198 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 
09:08:29.861954 6198 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1201 09:08:29.861838 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:29.861959 6198 obj_retry.go:365] Adding \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.980706 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28
edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.992599 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.996150 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.996200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.996213 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.996231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:38 crc kubenswrapper[4867]: I1201 09:08:38.996242 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:38Z","lastTransitionTime":"2025-12-01T09:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.005427 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d91
35265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.015782 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:39 crc 
kubenswrapper[4867]: I1201 09:08:39.026077 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.034959 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.097737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.097770 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.097781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.097797 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.097833 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:39Z","lastTransitionTime":"2025-12-01T09:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.200893 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.200938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.200947 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.200968 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.200978 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:39Z","lastTransitionTime":"2025-12-01T09:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.306726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.306785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.306795 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.306822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.306831 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:39Z","lastTransitionTime":"2025-12-01T09:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.410068 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.410110 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.410121 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.410137 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.410149 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:39Z","lastTransitionTime":"2025-12-01T09:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.513475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.513557 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.513586 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.513615 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.513637 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:39Z","lastTransitionTime":"2025-12-01T09:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.567096 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:39 crc kubenswrapper[4867]: E1201 09:08:39.567243 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:39 crc kubenswrapper[4867]: E1201 09:08:39.567302 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs podName:c3ff1be1-b98b-483b-83ca-eb2255f66c7c nodeName:}" failed. No retries permitted until 2025-12-01 09:08:47.567285751 +0000 UTC m=+49.026672505 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs") pod "network-metrics-daemon-n7wvd" (UID: "c3ff1be1-b98b-483b-83ca-eb2255f66c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.616098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.616157 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.616179 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.616207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.616230 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:39Z","lastTransitionTime":"2025-12-01T09:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.719283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.719324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.719334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.719350 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.719363 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:39Z","lastTransitionTime":"2025-12-01T09:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.822513 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.822563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.822573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.822587 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.822598 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:39Z","lastTransitionTime":"2025-12-01T09:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.826179 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:39 crc kubenswrapper[4867]: E1201 09:08:39.826328 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.826423 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:39 crc kubenswrapper[4867]: E1201 09:08:39.826501 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.826536 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.826634 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:39 crc kubenswrapper[4867]: E1201 09:08:39.826760 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:39 crc kubenswrapper[4867]: E1201 09:08:39.826910 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.924894 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.924936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.924947 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.924960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:39 crc kubenswrapper[4867]: I1201 09:08:39.924970 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:39Z","lastTransitionTime":"2025-12-01T09:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.026996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.027031 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.027040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.027054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.027062 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.129411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.129463 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.129475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.129491 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.129501 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.232350 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.232385 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.232396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.232411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.232423 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.334053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.334100 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.334113 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.334125 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.334132 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.436448 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.436483 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.436493 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.436507 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.436517 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.538859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.538893 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.538904 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.538954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.538967 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.641592 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.641651 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.641677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.641705 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.641723 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.672450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.672505 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.672528 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.672559 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.672582 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: E1201 09:08:40.692195 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.697404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.697459 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.697473 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.697493 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.697509 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: E1201 09:08:40.712768 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.718219 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.718263 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.718281 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.718306 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.718324 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: E1201 09:08:40.738116 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.742410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.742463 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.742481 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.742503 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.742521 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: E1201 09:08:40.763353 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.768321 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.768351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.768362 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.768378 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.768389 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: E1201 09:08:40.787356 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:40 crc kubenswrapper[4867]: E1201 09:08:40.787700 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.789621 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.789651 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.789662 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.789680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.789695 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.896374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.896551 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.896567 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.896582 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.896594 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.999349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.999410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.999432 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.999458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:40 crc kubenswrapper[4867]: I1201 09:08:40.999479 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:40Z","lastTransitionTime":"2025-12-01T09:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.101610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.102063 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.102282 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.102478 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.102620 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:41Z","lastTransitionTime":"2025-12-01T09:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.205256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.205329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.205351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.205381 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.205406 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:41Z","lastTransitionTime":"2025-12-01T09:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.307596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.307628 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.307638 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.307649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.307658 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:41Z","lastTransitionTime":"2025-12-01T09:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.410522 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.410575 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.410592 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.410655 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.410672 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:41Z","lastTransitionTime":"2025-12-01T09:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.514204 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.514246 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.514258 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.514274 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.514286 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:41Z","lastTransitionTime":"2025-12-01T09:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.617540 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.617614 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.617632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.617656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.617674 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:41Z","lastTransitionTime":"2025-12-01T09:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.720686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.720737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.720754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.720777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.720795 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:41Z","lastTransitionTime":"2025-12-01T09:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.822693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.822732 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.822743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.822759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.822787 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:41Z","lastTransitionTime":"2025-12-01T09:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.826769 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.826851 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.826933 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:41 crc kubenswrapper[4867]: E1201 09:08:41.826935 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.826978 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:41 crc kubenswrapper[4867]: E1201 09:08:41.827059 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:41 crc kubenswrapper[4867]: E1201 09:08:41.827128 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:41 crc kubenswrapper[4867]: E1201 09:08:41.827183 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.925446 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.925486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.925507 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.925526 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:41 crc kubenswrapper[4867]: I1201 09:08:41.925568 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:41Z","lastTransitionTime":"2025-12-01T09:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.028634 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.028675 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.028690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.028710 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.028725 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.131407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.131458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.131469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.131486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.131498 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.233482 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.233525 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.233536 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.233553 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.233563 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.335935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.335968 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.335977 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.335991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.335999 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.439398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.439480 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.439503 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.439531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.439550 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.542354 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.542416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.542433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.542458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.542477 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.645956 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.645989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.645999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.646047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.646081 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.749174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.749230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.749241 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.749260 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.749271 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.852730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.852789 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.852806 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.852859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.852874 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.955138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.955419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.955507 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.955638 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:42 crc kubenswrapper[4867]: I1201 09:08:42.955720 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:42Z","lastTransitionTime":"2025-12-01T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.057989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.058047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.058059 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.058077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.058090 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.161178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.161244 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.161267 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.161300 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.161323 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.267192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.267278 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.267298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.267340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.267361 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.370466 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.370497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.370505 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.370517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.370525 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.473119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.473152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.473161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.473176 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.473189 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.575887 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.575946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.575970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.575997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.576019 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.678267 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.678316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.678325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.678337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.678345 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.780469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.780517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.780556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.780573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.780584 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.826423 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.826463 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.826495 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:43 crc kubenswrapper[4867]: E1201 09:08:43.826672 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.826705 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:43 crc kubenswrapper[4867]: E1201 09:08:43.826887 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:43 crc kubenswrapper[4867]: E1201 09:08:43.827078 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:43 crc kubenswrapper[4867]: E1201 09:08:43.827231 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.883073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.883144 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.883169 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.883196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.883235 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.986212 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.986274 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.986297 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.986328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:43 crc kubenswrapper[4867]: I1201 09:08:43.986351 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:43Z","lastTransitionTime":"2025-12-01T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.090439 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.090486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.090500 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.090520 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.090535 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:44Z","lastTransitionTime":"2025-12-01T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.193702 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.194241 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.194425 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.194637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.194879 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:44Z","lastTransitionTime":"2025-12-01T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.297269 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.297332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.297343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.297357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.297368 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:44Z","lastTransitionTime":"2025-12-01T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.400138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.400184 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.400194 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.400211 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.400221 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:44Z","lastTransitionTime":"2025-12-01T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.503692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.503765 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.503786 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.503844 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.503871 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:44Z","lastTransitionTime":"2025-12-01T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.606736 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.606783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.606795 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.606834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.606847 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:44Z","lastTransitionTime":"2025-12-01T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.709203 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.709244 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.709259 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.709278 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.709292 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:44Z","lastTransitionTime":"2025-12-01T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.812365 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.812419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.812434 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.812455 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.812473 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:44Z","lastTransitionTime":"2025-12-01T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.915717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.915974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.916058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.916125 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:44 crc kubenswrapper[4867]: I1201 09:08:44.916199 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:44Z","lastTransitionTime":"2025-12-01T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.019384 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.019419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.019431 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.019446 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.019456 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.120925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.120979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.120991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.121008 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.121017 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.222873 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.222911 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.222922 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.222958 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.222969 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.324786 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.324851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.324864 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.324876 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.324891 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.427946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.428000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.428020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.428045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.428080 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.531769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.531846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.531868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.531914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.531941 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.634383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.634442 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.634460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.634483 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.634501 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.737142 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.737239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.737256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.737277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.737294 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.826465 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.826483 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.827119 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.827468 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:45 crc kubenswrapper[4867]: E1201 09:08:45.827587 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:45 crc kubenswrapper[4867]: E1201 09:08:45.827673 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:45 crc kubenswrapper[4867]: E1201 09:08:45.827877 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:45 crc kubenswrapper[4867]: E1201 09:08:45.827468 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.828086 4867 scope.go:117] "RemoveContainer" containerID="51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.840177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.840455 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.840572 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.840685 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.840833 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.943566 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.943977 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.943990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.944008 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:45 crc kubenswrapper[4867]: I1201 09:08:45.944021 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:45Z","lastTransitionTime":"2025-12-01T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.047920 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.047969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.047988 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.048010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.048027 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.115912 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/1.log" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.117895 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.118016 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.125236 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.131747 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.143509 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.150208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.150238 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.150248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.150259 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.150268 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.171687 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.183709 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.202655 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.224593 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.247765 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"erver/kube-apiserver-crc in node crc\\\\nI1201 09:08:29.861944 6198 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 09:08:29.861948 6198 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 
09:08:29.861954 6198 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1201 09:08:29.861838 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:29.861959 6198 obj_retry.go:365] Adding 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.253254 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.253298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.253309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.253329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.253360 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.267530 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7
3c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.278915 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.290029 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.299518 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc 
kubenswrapper[4867]: I1201 09:08:46.309979 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.319207 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.329007 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.340865 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.354918 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.355114 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.355127 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.355134 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.355147 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.355155 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.364227 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.457799 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.457863 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.457876 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.457896 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.457910 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.560680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.560723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.560750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.560764 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.560774 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.663130 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.663168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.663178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.663194 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.663204 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.765688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.765726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.765735 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.765748 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.765757 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.867648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.867905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.867967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.868028 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.868103 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.970116 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.970153 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.970163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.970176 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:46 crc kubenswrapper[4867]: I1201 09:08:46.970185 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:46Z","lastTransitionTime":"2025-12-01T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.072738 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.072777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.072788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.072803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.072876 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:47Z","lastTransitionTime":"2025-12-01T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.122458 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/2.log" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.123280 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/1.log" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.125974 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1" exitCode=1 Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.126102 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.126219 4867 scope.go:117] "RemoveContainer" containerID="51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.126959 4867 scope.go:117] "RemoveContainer" containerID="f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1" Dec 01 09:08:47 crc kubenswrapper[4867]: E1201 09:08:47.127189 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.147967 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.158620 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.167821 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.175270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.175316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.175329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.175347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.175358 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:47Z","lastTransitionTime":"2025-12-01T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.179615 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.194574 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.208351 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.226790 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353
e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.239221 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.251652 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.261444 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.278183 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.278215 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.278225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:47 crc 
kubenswrapper[4867]: I1201 09:08:47.278242 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.278254 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:47Z","lastTransitionTime":"2025-12-01T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.279090 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e86e7d1a7556927be01ed5b8553368a2dc63204ae9b855df0d30ea845b9840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"message\\\":\\\"erver/kube-apiserver-crc in node crc\\\\nI1201 09:08:29.861944 6198 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 09:08:29.861948 6198 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 
09:08:29.861954 6198 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1201 09:08:29.861838 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:29Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:29.861959 6198 obj_retry.go:365] Adding \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: 
unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\
\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.293603 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbe
d5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.309658 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.323403 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2dd
c2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.336368 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc 
kubenswrapper[4867]: I1201 09:08:47.350122 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.365079 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.381738 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.381836 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.381851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.381875 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.381891 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:47Z","lastTransitionTime":"2025-12-01T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.484544 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.484606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.484649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.484669 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.484684 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:47Z","lastTransitionTime":"2025-12-01T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.588322 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.588386 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.588402 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.588429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.588445 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:47Z","lastTransitionTime":"2025-12-01T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.654155 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:47 crc kubenswrapper[4867]: E1201 09:08:47.654387 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:47 crc kubenswrapper[4867]: E1201 09:08:47.654502 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs podName:c3ff1be1-b98b-483b-83ca-eb2255f66c7c nodeName:}" failed. No retries permitted until 2025-12-01 09:09:03.654473654 +0000 UTC m=+65.113860418 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs") pod "network-metrics-daemon-n7wvd" (UID: "c3ff1be1-b98b-483b-83ca-eb2255f66c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.691521 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.691576 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.691588 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.691606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.691619 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:47Z","lastTransitionTime":"2025-12-01T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.794459 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.794515 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.794528 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.794545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.794558 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:47Z","lastTransitionTime":"2025-12-01T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.825912 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.825973 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:47 crc kubenswrapper[4867]: E1201 09:08:47.826063 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.826090 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.826099 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:47 crc kubenswrapper[4867]: E1201 09:08:47.826243 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:47 crc kubenswrapper[4867]: E1201 09:08:47.826297 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:47 crc kubenswrapper[4867]: E1201 09:08:47.826367 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.897491 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.897524 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.897535 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.897550 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:47 crc kubenswrapper[4867]: I1201 09:08:47.897561 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:47Z","lastTransitionTime":"2025-12-01T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.001329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.001404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.001420 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.001441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.001456 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.104321 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.104376 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.104388 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.104406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.104418 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.135753 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/2.log" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.140748 4867 scope.go:117] "RemoveContainer" containerID="f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1" Dec 01 09:08:48 crc kubenswrapper[4867]: E1201 09:08:48.141025 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.159473 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.173101 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.195529 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.207312 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.207351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.207973 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.208000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.208014 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.209427 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7
3c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.219686 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.232362 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.246951 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc 
kubenswrapper[4867]: I1201 09:08:48.260539 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.270673 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.282176 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.294112 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.306390 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a7
7593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.309903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.309952 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.309963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.309980 4867 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.309991 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.318322 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.333759 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.346742 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.365054 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353
e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.378338 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.379590 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.388586 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.394460 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.408255 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.411967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.411999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.412009 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.412023 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.412033 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.422930 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.454260 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.467152 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.483553 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.495820 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.513056 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.513774 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.513931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.514115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.514212 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.514290 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.525360 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.540250 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.552291 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.564603 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.584672 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.597840 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.610076 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.616694 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.616719 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.616728 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.616743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.616753 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.621951 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.637274 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc 
kubenswrapper[4867]: I1201 09:08:48.719394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.719431 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.719440 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.719453 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.719462 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.821976 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.822025 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.822044 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.822067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.822084 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.845909 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.859340 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.870578 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.886538 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.908684 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.920909 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.924893 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.924935 
4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.924948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.924966 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.924979 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:48Z","lastTransitionTime":"2025-12-01T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.936024 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.950373 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.965467 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.981308 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:48 crc kubenswrapper[4867]: I1201 09:08:48.991626 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdd80a4-08b4-45b6-9559-e4ba382f17d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0116ddbcf9fa956327663d364bab7e5374fc92b182321bdc348b4af4ac47d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.011413 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.020317 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.027052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.027084 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.027092 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc 
kubenswrapper[4867]: I1201 09:08:49.027106 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.027115 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.040106 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.050756 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.059893 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.069750 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.078672 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:49 crc 
kubenswrapper[4867]: I1201 09:08:49.129067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.129106 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.129117 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.129130 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.129142 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.231684 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.232000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.232066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.232136 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.232199 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.334567 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.334609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.334619 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.334633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.334642 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.437256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.437289 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.437297 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.437310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.437319 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.539213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.539262 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.539272 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.539287 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.539296 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.571716 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.571966 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.572020 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.572058 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.572180 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.572276 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:09:21.572253392 +0000 UTC m=+83.031640166 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.572287 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.572307 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:09:21.572297584 +0000 UTC m=+83.031684358 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.572394 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:09:21.572373506 +0000 UTC m=+83.031760260 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.572492 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.572572 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.572674 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.572796 4867 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:09:21.572769856 +0000 UTC m=+83.032156610 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.641700 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.642263 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.642466 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.642639 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.642774 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.673367 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.673844 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.673963 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.674176 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.674330 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:09:21.674309089 +0000 UTC m=+83.133695903 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.744664 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.744700 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.744710 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.744726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.744736 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.826947 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.826979 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.827041 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.827098 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.826947 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.827175 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.827218 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:49 crc kubenswrapper[4867]: E1201 09:08:49.827254 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.846733 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.846776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.846785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.846800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.846830 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.949339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.949388 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.949400 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.949417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:49 crc kubenswrapper[4867]: I1201 09:08:49.949429 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:49Z","lastTransitionTime":"2025-12-01T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.052743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.052799 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.052850 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.052874 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.052896 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.155307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.155370 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.155378 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.155390 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.155400 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.257368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.257427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.257441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.257461 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.257476 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.359871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.359908 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.359918 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.359932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.359943 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.462590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.462633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.462645 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.462662 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.462673 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.564325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.564352 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.564360 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.564373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.564382 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.667850 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.667891 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.667917 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.667936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.667949 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.770333 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.770378 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.770389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.770405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.770414 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.872839 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.872876 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.872884 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.872904 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.872914 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.975178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.975209 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.975217 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.975230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:50 crc kubenswrapper[4867]: I1201 09:08:50.975238 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:50Z","lastTransitionTime":"2025-12-01T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.078594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.078665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.078687 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.078717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.078740 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.175970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.176022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.176034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.176052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.176063 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.189469 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.194157 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.194202 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.194217 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.194237 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.194251 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.213698 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.217173 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.217253 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.217299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.217321 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.217337 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.233268 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.236596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.236636 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.236650 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.236667 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.236679 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.248863 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.252216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.252257 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.252266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.252302 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.252314 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.262723 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.262902 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.264424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.264475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.264491 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.264512 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.264526 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.366905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.366944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.366955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.366970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.366980 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.469387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.469418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.469427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.469441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.469450 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.572098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.572165 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.572178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.572194 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.572205 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.675248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.675297 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.675312 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.675332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.675348 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.778729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.778782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.778825 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.778844 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.778855 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.826766 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.826782 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.826783 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.826908 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.827109 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.827375 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.827514 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:51 crc kubenswrapper[4867]: E1201 09:08:51.827715 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.881984 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.882038 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.882050 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.882067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.882079 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.984292 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.984598 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.984691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.984779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:51 crc kubenswrapper[4867]: I1201 09:08:51.984882 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:51Z","lastTransitionTime":"2025-12-01T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.086725 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.086775 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.086788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.086804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.086835 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:52Z","lastTransitionTime":"2025-12-01T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.189505 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.189585 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.189597 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.189614 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.189628 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:52Z","lastTransitionTime":"2025-12-01T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.292362 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.292412 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.292427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.292448 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.292468 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:52Z","lastTransitionTime":"2025-12-01T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.395096 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.395156 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.395174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.395193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.395205 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:52Z","lastTransitionTime":"2025-12-01T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.497795 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.497851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.497862 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.497879 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.497890 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:52Z","lastTransitionTime":"2025-12-01T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.600244 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.600290 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.600299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.600313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.600322 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:52Z","lastTransitionTime":"2025-12-01T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.702356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.702384 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.702395 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.702407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.702415 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:52Z","lastTransitionTime":"2025-12-01T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.804888 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.804936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.804952 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.804974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.804990 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:52Z","lastTransitionTime":"2025-12-01T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.908367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.908424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.908441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.908463 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:52 crc kubenswrapper[4867]: I1201 09:08:52.908485 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:52Z","lastTransitionTime":"2025-12-01T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.010161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.010207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.010219 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.010235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.010246 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.112862 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.112917 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.112975 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.112999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.113015 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.215950 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.216193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.216256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.216322 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.216383 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.319751 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.320138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.320319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.320474 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.320634 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.423433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.423713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.423865 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.424023 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.424141 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.534419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.534452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.534460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.534476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.534486 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.636514 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.636876 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.636886 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.636902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.636912 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.739148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.739216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.739239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.739266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.739286 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.826779 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.826784 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.826789 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.826794 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:53 crc kubenswrapper[4867]: E1201 09:08:53.826967 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:53 crc kubenswrapper[4867]: E1201 09:08:53.827087 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:53 crc kubenswrapper[4867]: E1201 09:08:53.827152 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:53 crc kubenswrapper[4867]: E1201 09:08:53.827211 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.841934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.842296 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.842419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.842539 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.842642 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.945427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.945469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.945481 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.945497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:53 crc kubenswrapper[4867]: I1201 09:08:53.945508 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:53Z","lastTransitionTime":"2025-12-01T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.047274 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.047502 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.047577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.047686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.047828 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.150527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.150793 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.150908 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.151003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.151087 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.253431 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.253661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.253775 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.253873 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.253958 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.356161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.356403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.356473 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.356532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.356596 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.458594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.458925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.459016 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.459104 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.459167 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.561799 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.561881 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.561892 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.561909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.561922 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.664497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.664555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.664565 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.664579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.664590 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.766374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.766433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.766444 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.766459 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.766470 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.868749 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.868836 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.868853 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.868866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.868876 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.971621 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.971678 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.971690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.971706 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:54 crc kubenswrapper[4867]: I1201 09:08:54.971717 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:54Z","lastTransitionTime":"2025-12-01T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.074184 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.074220 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.074266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.074280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.074289 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.176295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.176330 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.176338 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.176351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.176361 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.278471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.278517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.278525 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.278539 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.278548 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.383373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.384030 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.384128 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.384163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.384184 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.487032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.487123 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.487141 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.487168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.487186 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.588997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.589037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.589048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.589063 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.589075 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.691649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.691699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.691708 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.691724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.691734 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.794285 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.794319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.794328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.794343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.794353 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.826715 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.826738 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.826782 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:55 crc kubenswrapper[4867]: E1201 09:08:55.827358 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:55 crc kubenswrapper[4867]: E1201 09:08:55.827158 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:55 crc kubenswrapper[4867]: E1201 09:08:55.827278 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.826797 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:55 crc kubenswrapper[4867]: E1201 09:08:55.827444 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.896272 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.896334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.896342 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.896355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.896364 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.998726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.998767 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.998783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.998847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:55 crc kubenswrapper[4867]: I1201 09:08:55.998871 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:55Z","lastTransitionTime":"2025-12-01T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.101599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.101639 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.101650 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.101665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.101674 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:56Z","lastTransitionTime":"2025-12-01T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.204871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.204934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.204954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.204979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.205000 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:56Z","lastTransitionTime":"2025-12-01T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.307370 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.307419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.307435 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.307457 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.307467 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:56Z","lastTransitionTime":"2025-12-01T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.409878 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.409932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.409948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.409970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.409987 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:56Z","lastTransitionTime":"2025-12-01T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.512390 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.512609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.512711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.512859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.512953 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:56Z","lastTransitionTime":"2025-12-01T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.615857 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.615883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.615890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.615902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.615912 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:56Z","lastTransitionTime":"2025-12-01T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.717918 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.717983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.717999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.718020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.718035 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:56Z","lastTransitionTime":"2025-12-01T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.820313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.820777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.820871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.820936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.820998 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:56Z","lastTransitionTime":"2025-12-01T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.923185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.923445 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.923548 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.923649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:56 crc kubenswrapper[4867]: I1201 09:08:56.923726 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:56Z","lastTransitionTime":"2025-12-01T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.026236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.026305 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.026318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.026334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.026345 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.129347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.129601 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.129694 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.129774 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.129918 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.232287 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.232571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.232671 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.232782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.232894 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.335641 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.335674 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.335684 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.335698 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.335708 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.437545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.437778 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.437894 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.438004 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.438072 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.540501 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.540540 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.540549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.540564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.540572 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.642652 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.642910 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.643035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.643125 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.643201 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.744890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.744931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.744942 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.744955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.744964 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.826261 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:57 crc kubenswrapper[4867]: E1201 09:08:57.826461 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.826261 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:57 crc kubenswrapper[4867]: E1201 09:08:57.826559 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.826293 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:57 crc kubenswrapper[4867]: E1201 09:08:57.826637 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.826261 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:57 crc kubenswrapper[4867]: E1201 09:08:57.826730 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.848168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.848207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.848221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.848240 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.848255 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.950520 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.950569 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.950578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.950592 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:57 crc kubenswrapper[4867]: I1201 09:08:57.950601 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:57Z","lastTransitionTime":"2025-12-01T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.053070 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.053115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.053127 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.053142 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.053152 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.155551 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.155589 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.155599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.155613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.155622 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.257370 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.257437 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.257460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.257487 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.257507 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.361029 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.361072 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.361081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.361095 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.361104 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.463779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.463843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.463854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.463867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.463876 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.565792 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.565845 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.565854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.565869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.565880 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.667931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.667974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.667984 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.667999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.668011 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.770482 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.770547 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.770563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.770583 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.770598 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.846279 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.859306 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.872541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.872606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.872625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.872642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.872654 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.882311 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.897794 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.911862 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.923369 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.947482 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.965897 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28
edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.974894 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.974929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.974939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.974954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.974963 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:58Z","lastTransitionTime":"2025-12-01T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.979454 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdd80a4-08b4-45b6-9559-e4ba382f17d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0116ddbcf9fa956327663d364bab7e
5374fc92b182321bdc348b4af4ac47d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:58 crc kubenswrapper[4867]: I1201 09:08:58.998415 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.012670 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.027109 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:59 crc 
kubenswrapper[4867]: I1201 09:08:59.043476 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.051510 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.060105 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.070449 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.077500 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.077532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.077543 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.077571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.077583 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:59Z","lastTransitionTime":"2025-12-01T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.079886 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.088632 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.179379 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.179410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.179418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.179433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.179442 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:59Z","lastTransitionTime":"2025-12-01T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.281573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.281605 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.281614 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.281627 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.281635 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:59Z","lastTransitionTime":"2025-12-01T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.384404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.384461 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.384477 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.384498 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.384516 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:59Z","lastTransitionTime":"2025-12-01T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.487462 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.487501 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.487509 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.487526 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.487536 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:59Z","lastTransitionTime":"2025-12-01T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.589981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.590027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.590040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.590098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.590112 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:59Z","lastTransitionTime":"2025-12-01T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.692914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.692988 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.693006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.693036 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.693059 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:59Z","lastTransitionTime":"2025-12-01T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.794694 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.794723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.794731 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.794743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.794751 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:59Z","lastTransitionTime":"2025-12-01T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.826293 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.826344 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:08:59 crc kubenswrapper[4867]: E1201 09:08:59.826411 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:08:59 crc kubenswrapper[4867]: E1201 09:08:59.826512 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.826575 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:08:59 crc kubenswrapper[4867]: E1201 09:08:59.826616 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.826299 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:08:59 crc kubenswrapper[4867]: E1201 09:08:59.826673 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.896829 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.896873 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.896903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.896919 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:08:59 crc kubenswrapper[4867]: I1201 09:08:59.896929 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:08:59Z","lastTransitionTime":"2025-12-01T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.001710 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.001756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.001783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.001804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.001850 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.104370 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.104406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.104416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.104430 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.104440 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.206546 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.206589 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.206605 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.206625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.206640 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.309123 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.309163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.309178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.309197 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.309214 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.411604 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.411670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.411697 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.411724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.411745 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.513711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.513774 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.513792 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.513860 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.513884 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.616410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.616454 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.616469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.616491 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.616507 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.718506 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.718551 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.718560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.718574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.718584 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.821311 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.821360 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.821368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.821383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.821394 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.923751 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.923798 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.923826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.923840 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:00 crc kubenswrapper[4867]: I1201 09:09:00.923851 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:00Z","lastTransitionTime":"2025-12-01T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.026247 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.026290 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.026298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.026312 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.026323 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.129325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.129401 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.129425 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.129458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.129482 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.231348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.231396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.231407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.231422 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.231435 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.334319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.334379 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.334396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.334417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.334432 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.437094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.437141 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.437149 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.437164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.437173 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.518429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.518486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.518499 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.518516 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.518528 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.531289 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.535399 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.535476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.535499 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.535529 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.535550 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.549694 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.553936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.553970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.553979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.553992 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.554003 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.566256 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.569371 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.569398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.569408 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.569420 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.569429 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.581071 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.584119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.584171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.584181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.584194 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.584203 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.596152 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.596273 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.602355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.602380 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.602406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.602419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.602428 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.704427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.704471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.704480 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.704494 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.704504 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.807593 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.807676 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.807688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.807711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.807725 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.826286 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.826344 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.826348 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.826399 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.826421 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.826567 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.826643 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:01 crc kubenswrapper[4867]: E1201 09:09:01.826888 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.910382 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.910429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.910441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.910456 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:01 crc kubenswrapper[4867]: I1201 09:09:01.910469 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:01Z","lastTransitionTime":"2025-12-01T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.013421 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.013473 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.013490 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.013515 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.013533 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.117391 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.117479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.117499 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.117549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.117566 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.220053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.220088 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.220098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.220113 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.220123 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.322759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.322795 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.322803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.322833 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.322842 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.425130 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.425181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.425194 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.425213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.425225 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.526769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.526803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.526842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.526856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.526866 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.629545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.629616 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.629632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.629653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.630099 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.733055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.733095 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.733112 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.733135 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.733152 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.834567 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.834600 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.834609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.834622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.834630 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.936482 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.936523 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.936531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.936546 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:02 crc kubenswrapper[4867]: I1201 09:09:02.936555 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:02Z","lastTransitionTime":"2025-12-01T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.038759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.038789 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.038798 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.038831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.038843 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.141089 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.141129 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.141139 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.141156 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.141167 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.243648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.243683 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.243693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.243706 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.243715 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.346934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.346991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.347003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.347023 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.347034 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.449741 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.449783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.449792 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.449805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.449837 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.552953 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.553282 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.553382 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.553481 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.553580 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.656063 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.656102 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.656111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.656127 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.656138 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.729427 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:03 crc kubenswrapper[4867]: E1201 09:09:03.729551 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:09:03 crc kubenswrapper[4867]: E1201 09:09:03.729621 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs podName:c3ff1be1-b98b-483b-83ca-eb2255f66c7c nodeName:}" failed. No retries permitted until 2025-12-01 09:09:35.729602749 +0000 UTC m=+97.188989513 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs") pod "network-metrics-daemon-n7wvd" (UID: "c3ff1be1-b98b-483b-83ca-eb2255f66c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.758230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.758265 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.758301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.758314 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.758322 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.826392 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.826432 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:03 crc kubenswrapper[4867]: E1201 09:09:03.826518 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.826517 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:03 crc kubenswrapper[4867]: E1201 09:09:03.826981 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:03 crc kubenswrapper[4867]: E1201 09:09:03.827057 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.827137 4867 scope.go:117] "RemoveContainer" containerID="f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.827286 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:03 crc kubenswrapper[4867]: E1201 09:09:03.827393 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" Dec 01 09:09:03 crc kubenswrapper[4867]: E1201 09:09:03.827442 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.860629 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.860691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.860704 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.860720 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.860914 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.963386 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.963453 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.963470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.963489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:03 crc kubenswrapper[4867]: I1201 09:09:03.963504 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:03Z","lastTransitionTime":"2025-12-01T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.064936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.064970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.065001 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.065017 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.065027 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.166621 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.166686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.166699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.166715 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.166725 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.269241 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.269300 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.269309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.269325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.269349 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.371345 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.371386 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.371396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.371411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.371422 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.473289 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.473388 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.473448 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.473476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.473497 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.575519 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.575569 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.575580 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.575597 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.575609 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.677712 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.677982 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.678043 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.678116 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.678179 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.780091 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.780132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.780143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.780160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.780171 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.881730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.881767 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.881777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.881790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.881799 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.984077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.984296 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.984364 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.984429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:04 crc kubenswrapper[4867]: I1201 09:09:04.984488 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:04Z","lastTransitionTime":"2025-12-01T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.086882 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.087403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.087470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.087543 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.087605 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:05Z","lastTransitionTime":"2025-12-01T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.189454 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.189690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.189756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.189860 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.189927 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:05Z","lastTransitionTime":"2025-12-01T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.293890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.293926 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.293939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.293955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.293966 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:05Z","lastTransitionTime":"2025-12-01T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.395895 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.395929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.395938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.395950 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.395958 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:05Z","lastTransitionTime":"2025-12-01T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.498111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.498181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.498195 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.498211 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.498222 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:05Z","lastTransitionTime":"2025-12-01T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.600508 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.600546 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.600556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.600570 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.600581 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:05Z","lastTransitionTime":"2025-12-01T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.703834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.703883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.703896 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.703909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.703918 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:05Z","lastTransitionTime":"2025-12-01T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.806682 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.807028 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.807040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.807055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.807067 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:05Z","lastTransitionTime":"2025-12-01T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.826264 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.826487 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:05 crc kubenswrapper[4867]: E1201 09:09:05.826598 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:05 crc kubenswrapper[4867]: E1201 09:09:05.826716 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.826341 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:05 crc kubenswrapper[4867]: E1201 09:09:05.826969 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.826386 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:05 crc kubenswrapper[4867]: E1201 09:09:05.827157 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.909172 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.909368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.909433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.909500 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:05 crc kubenswrapper[4867]: I1201 09:09:05.909560 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:05Z","lastTransitionTime":"2025-12-01T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.012414 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.012465 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.012477 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.012496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.012506 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.114978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.115471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.115547 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.115651 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.115727 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.191041 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tj9fl_c813b7ba-4c04-44d0-9f3e-3e5f4897fb73/kube-multus/0.log" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.191408 4867 generic.go:334] "Generic (PLEG): container finished" podID="c813b7ba-4c04-44d0-9f3e-3e5f4897fb73" containerID="7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1" exitCode=1 Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.191525 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tj9fl" event={"ID":"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73","Type":"ContainerDied","Data":"7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.192376 4867 scope.go:117] "RemoveContainer" containerID="7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.217121 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.218337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.218374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.218384 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 
09:09:06.218397 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.218406 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.230841 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40
ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.245505 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:09:05Z\\\",\\\"message\\\":\\\"2025-12-01T09:08:20+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf\\\\n2025-12-01T09:08:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf to /host/opt/cni/bin/\\\\n2025-12-01T09:08:20Z [verbose] multus-daemon started\\\\n2025-12-01T09:08:20Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:09:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.263871 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.282127 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.292578 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.303275 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.318889 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.320410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.320556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.320630 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.320712 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.320794 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.331909 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7
3c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.343030 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdd80a4-08b4-45b6-9559-e4ba382f17d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0116ddbcf9fa956327663d364bab7e5374fc92b182321bdc348b4af4ac47d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.354425 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.367206 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2dd
c2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.377580 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc 
kubenswrapper[4867]: I1201 09:09:06.389014 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.397652 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.411174 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.423250 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.423324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.423660 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.423800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.423925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.423938 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.432929 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.525691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.525721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.525728 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.525743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.525751 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.627637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.627906 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.628026 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.628121 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.628222 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.730865 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.730904 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.730915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.730931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.730940 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.833279 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.833318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.833326 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.833339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.833348 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.935472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.935714 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.935843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.935917 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:06 crc kubenswrapper[4867]: I1201 09:09:06.935983 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:06Z","lastTransitionTime":"2025-12-01T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.039327 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.039369 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.039416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.039441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.039453 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.142776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.142870 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.142883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.142900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.142912 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.194756 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tj9fl_c813b7ba-4c04-44d0-9f3e-3e5f4897fb73/kube-multus/0.log" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.194797 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tj9fl" event={"ID":"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73","Type":"ContainerStarted","Data":"5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.207076 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.216157 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.227119 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.238614 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc 
kubenswrapper[4867]: I1201 09:09:07.248598 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.248645 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.248657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.248672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.248704 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.253859 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.267540 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.279828 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.301192 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.316131 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.329607 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.342209 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.354304 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.354344 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc 
kubenswrapper[4867]: I1201 09:09:07.354354 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.354368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.354378 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.362396 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.374485 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-01T09:09:05Z\\\",\\\"message\\\":\\\"2025-12-01T09:08:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf\\\\n2025-12-01T09:08:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf to /host/opt/cni/bin/\\\\n2025-12-01T09:08:20Z [verbose] multus-daemon started\\\\n2025-12-01T09:08:20Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:09:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:09:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.388174 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28
edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.398356 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdd80a4-08b4-45b6-9559-e4ba382f17d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0116ddbcf9fa956327663d364bab7e5374fc92b182321bdc348b4af4ac47d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.409600 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.419221 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.436519 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.455982 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.456026 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.456034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.456048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.456059 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.557900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.557958 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.557973 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.557995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.558010 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.660389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.660423 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.660431 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.660444 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.660453 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.762881 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.762914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.762922 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.762936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.762949 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.826073 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.826125 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.826131 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:07 crc kubenswrapper[4867]: E1201 09:09:07.826209 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.826214 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:07 crc kubenswrapper[4867]: E1201 09:09:07.826305 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:07 crc kubenswrapper[4867]: E1201 09:09:07.826436 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:07 crc kubenswrapper[4867]: E1201 09:09:07.826550 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.864935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.864991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.865001 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.865016 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.865026 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.967485 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.967524 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.967535 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.967549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:07 crc kubenswrapper[4867]: I1201 09:09:07.967559 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:07Z","lastTransitionTime":"2025-12-01T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.070055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.070103 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.070117 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.070138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.070151 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.172252 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.172300 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.172313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.172331 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.172344 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.274454 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.274505 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.274516 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.274533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.274545 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.377031 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.377079 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.377092 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.377111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.377125 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.479061 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.479101 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.479111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.479127 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.479138 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.581149 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.581198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.581206 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.581220 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.581229 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.683040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.683082 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.683094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.683113 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.683124 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.785771 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.785805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.785846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.785878 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.785887 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.839285 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc 
kubenswrapper[4867]: I1201 09:09:08.851076 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.862068 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.872885 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.884932 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.887765 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.887795 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.887804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.887838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.887850 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.894827 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.903327 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.914252 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.924724 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-01T09:09:05Z\\\",\\\"message\\\":\\\"2025-12-01T09:08:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf\\\\n2025-12-01T09:08:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf to /host/opt/cni/bin/\\\\n2025-12-01T09:08:20Z [verbose] multus-daemon started\\\\n2025-12-01T09:08:20Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:09:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:09:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.944964 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.956071 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.966826 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.978460 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.989347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.989383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:08 crc 
kubenswrapper[4867]: I1201 09:09:08.989393 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.989409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.989420 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:08Z","lastTransitionTime":"2025-12-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:08 crc kubenswrapper[4867]: I1201 09:09:08.993745 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 
obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.010494 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28
edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.019487 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdd80a4-08b4-45b6-9559-e4ba382f17d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0116ddbcf9fa956327663d364bab7e5374fc92b182321bdc348b4af4ac47d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.031543 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.041637 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.091421 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.091457 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.091465 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:09 crc 
kubenswrapper[4867]: I1201 09:09:09.091478 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.091486 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:09Z","lastTransitionTime":"2025-12-01T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.193909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.193954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.193966 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.193981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.193991 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:09Z","lastTransitionTime":"2025-12-01T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.296079 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.296119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.296128 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.296142 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.296151 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:09Z","lastTransitionTime":"2025-12-01T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.398914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.399089 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.399108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.399132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.399148 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:09Z","lastTransitionTime":"2025-12-01T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.501570 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.501615 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.501627 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.501645 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.501655 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:09Z","lastTransitionTime":"2025-12-01T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.604017 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.604057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.604066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.604080 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.604090 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:09Z","lastTransitionTime":"2025-12-01T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.706881 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.706925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.706935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.706949 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.706957 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:09Z","lastTransitionTime":"2025-12-01T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.809685 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.809727 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.809739 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.809754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.809763 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:09Z","lastTransitionTime":"2025-12-01T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.826449 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.826467 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.826559 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:09 crc kubenswrapper[4867]: E1201 09:09:09.826690 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.826722 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:09 crc kubenswrapper[4867]: E1201 09:09:09.826781 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:09 crc kubenswrapper[4867]: E1201 09:09:09.826915 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:09 crc kubenswrapper[4867]: E1201 09:09:09.827022 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.911862 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.911909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.912594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.912619 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:09 crc kubenswrapper[4867]: I1201 09:09:09.912638 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:09Z","lastTransitionTime":"2025-12-01T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.014690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.014728 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.014742 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.014759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.014772 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.117050 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.117086 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.117094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.117108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.117120 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.220371 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.220403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.220413 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.220428 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.220438 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.322622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.322660 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.322677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.322694 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.322706 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.425526 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.425565 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.425573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.425587 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.425596 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.527638 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.527694 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.527706 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.527724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.527735 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.629851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.629894 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.630093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.630111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.630121 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.732902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.732940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.732948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.732967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.732976 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.834948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.835276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.835304 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.835328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.835374 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.937590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.937624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.937632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.937646 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:10 crc kubenswrapper[4867]: I1201 09:09:10.937655 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:10Z","lastTransitionTime":"2025-12-01T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.039831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.040051 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.040225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.040371 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.040503 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.142389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.142626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.142692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.142750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.142804 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.244732 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.245218 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.245335 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.245411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.245474 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.348049 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.348284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.348365 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.348457 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.348532 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.450697 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.450777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.450788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.450804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.450843 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.552778 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.553032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.553102 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.553166 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.553229 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.655782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.655872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.655890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.655914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.655931 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.718839 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.718884 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.718893 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.718909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.718919 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.735365 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:11Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.738973 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.739034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.739058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.739087 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.739108 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.759397 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:11Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.764510 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.764557 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.764607 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.764633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.764681 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.781471 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:11Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.786457 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.786517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.786536 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.786559 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.786577 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.798853 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:11Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.806144 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.806175 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.806183 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.806197 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.806207 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.820228 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:11Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.820612 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.822284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.822325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.822335 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.822348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.822356 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.826613 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.826647 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.826630 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.826719 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.826759 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.826865 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.826907 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:11 crc kubenswrapper[4867]: E1201 09:09:11.827001 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.924693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.924729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.924741 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.924756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:11 crc kubenswrapper[4867]: I1201 09:09:11.924768 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:11Z","lastTransitionTime":"2025-12-01T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.026898 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.026932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.026942 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.026960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.026978 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.130372 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.130459 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.130486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.130528 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.130553 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.233435 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.233553 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.233572 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.233606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.233644 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.336495 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.336531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.336543 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.336559 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.336570 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.438661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.438726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.438748 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.438776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.438798 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.542063 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.542102 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.542115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.542131 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.542142 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.644715 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.644777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.644790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.644839 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.644854 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.747262 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.747313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.747326 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.747348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.747364 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.849955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.850016 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.850034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.850058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.850076 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.953059 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.953497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.953785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.954025 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:12 crc kubenswrapper[4867]: I1201 09:09:12.954295 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:12Z","lastTransitionTime":"2025-12-01T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.057963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.058329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.058514 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.058626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.058738 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.162044 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.162108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.162117 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.162132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.162142 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.264777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.264879 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.264902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.264929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.264948 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.367783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.367892 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.367919 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.367948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.367970 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.470962 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.471472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.471855 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.472216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.472566 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.576033 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.576088 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.576106 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.576128 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.576143 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.679374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.679759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.679962 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.680161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.680317 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.782851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.783124 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.783284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.783407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.783516 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.826705 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.826723 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:13 crc kubenswrapper[4867]: E1201 09:09:13.827116 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.826848 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:13 crc kubenswrapper[4867]: E1201 09:09:13.827013 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.826871 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:13 crc kubenswrapper[4867]: E1201 09:09:13.827270 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:13 crc kubenswrapper[4867]: E1201 09:09:13.827333 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.886502 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.886564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.886583 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.886611 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.886629 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.989247 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.989296 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.989307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.989325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:13 crc kubenswrapper[4867]: I1201 09:09:13.989334 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:13Z","lastTransitionTime":"2025-12-01T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.092049 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.092120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.092142 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.092170 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.092191 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:14Z","lastTransitionTime":"2025-12-01T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.194546 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.194579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.194588 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.194601 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.194610 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:14Z","lastTransitionTime":"2025-12-01T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.296946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.297259 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.297327 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.297533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.297622 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:14Z","lastTransitionTime":"2025-12-01T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.400637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.400687 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.400703 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.400722 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.400739 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:14Z","lastTransitionTime":"2025-12-01T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.502746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.503010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.503081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.503149 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.503213 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:14Z","lastTransitionTime":"2025-12-01T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.606378 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.606441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.606458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.606479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.606494 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:14Z","lastTransitionTime":"2025-12-01T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.708903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.708951 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.708964 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.708981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.708993 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:14Z","lastTransitionTime":"2025-12-01T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.811307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.811340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.811349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.811361 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.811370 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:14Z","lastTransitionTime":"2025-12-01T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.914104 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.914387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.914454 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.914513 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:14 crc kubenswrapper[4867]: I1201 09:09:14.914609 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:14Z","lastTransitionTime":"2025-12-01T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.017333 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.017611 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.017698 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.017783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.017919 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.120231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.120308 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.120332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.120362 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.120384 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.223266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.223347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.223373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.223399 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.223417 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.325745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.325786 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.325797 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.325835 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.325846 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.428374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.428410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.428421 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.428438 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.428450 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.530444 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.530729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.530834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.530937 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.531027 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.633510 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.633578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.633590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.633605 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.633614 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.736365 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.736405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.736415 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.736430 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.736463 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.826283 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:15 crc kubenswrapper[4867]: E1201 09:09:15.826467 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.826749 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:15 crc kubenswrapper[4867]: E1201 09:09:15.826938 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.827152 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:15 crc kubenswrapper[4867]: E1201 09:09:15.827249 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.827504 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:15 crc kubenswrapper[4867]: E1201 09:09:15.827649 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.839373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.839417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.839428 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.839449 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.839463 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.941648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.941700 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.941711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.941727 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:15 crc kubenswrapper[4867]: I1201 09:09:15.941740 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:15Z","lastTransitionTime":"2025-12-01T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.044424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.044451 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.044459 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.044471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.044480 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.147148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.147202 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.147216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.147236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.147250 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.248871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.248909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.248919 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.248932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.248941 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.352221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.352624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.352807 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.353152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.353357 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.455920 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.455960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.455969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.455983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.455997 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.559090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.559133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.559144 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.559160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.559170 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.661967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.662011 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.662020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.662034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.662043 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.764611 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.764669 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.764679 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.764693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.764702 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.867837 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.867903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.867930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.867951 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.867965 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.969798 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.970075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.970154 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.970279 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:16 crc kubenswrapper[4867]: I1201 09:09:16.970367 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:16Z","lastTransitionTime":"2025-12-01T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.072396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.072426 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.072434 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.072445 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.072453 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:17Z","lastTransitionTime":"2025-12-01T09:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.175933 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.175970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.175978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.175992 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.176002 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:17Z","lastTransitionTime":"2025-12-01T09:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.278256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.278307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.278319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.278336 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.278349 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:17Z","lastTransitionTime":"2025-12-01T09:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.380237 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.380277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.380288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.380303 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.380314 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:17Z","lastTransitionTime":"2025-12-01T09:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.482554 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.482612 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.482626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.482648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.482661 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:17Z","lastTransitionTime":"2025-12-01T09:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.586255 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.586521 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.586608 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.586695 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.586784 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:17Z","lastTransitionTime":"2025-12-01T09:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.689851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.690208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.690404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.690585 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.690737 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:17Z","lastTransitionTime":"2025-12-01T09:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.794501 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.794545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.794556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.794571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.794583 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:17Z","lastTransitionTime":"2025-12-01T09:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.826096 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:17 crc kubenswrapper[4867]: E1201 09:09:17.826777 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.826905 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.826915 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.826923 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.827477 4867 scope.go:117] "RemoveContainer" containerID="f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1" Dec 01 09:09:17 crc kubenswrapper[4867]: E1201 09:09:17.827958 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:17 crc kubenswrapper[4867]: E1201 09:09:17.828058 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:17 crc kubenswrapper[4867]: E1201 09:09:17.828131 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.897472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.897515 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.897533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.897548 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:17 crc kubenswrapper[4867]: I1201 09:09:17.897569 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:17Z","lastTransitionTime":"2025-12-01T09:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.000078 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.000115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.000123 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.000137 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.000147 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.102321 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.102389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.102407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.102429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.102447 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.205067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.205172 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.205192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.205561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.205778 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.308958 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.309000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.309014 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.309032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.309044 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.412015 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.412333 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.412345 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.412361 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.412374 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.514614 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.515225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.515309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.515429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.515527 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.618677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.619035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.619124 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.619217 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.619297 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.721903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.721969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.721981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.721996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.722008 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.824662 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.824700 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.824722 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.824736 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.824745 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.840033 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.850375 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.866122 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.878778 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.891062 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.902859 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.916869 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.926997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.927227 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.927314 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.927421 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.927549 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:18Z","lastTransitionTime":"2025-12-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.931132 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:09:05Z\\\",\\\"message\\\":\\\"2025-12-01T09:08:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf\\\\n2025-12-01T09:08:20+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf to /host/opt/cni/bin/\\\\n2025-12-01T09:08:20Z [verbose] multus-daemon started\\\\n2025-12-01T09:08:20Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:09:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:09:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.950856 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.962928 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdd80a4-08b4-45b6-9559-e4ba382f17d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0116ddbcf9fa956327663d364bab7e5374fc92b182321bdc348b4af4ac47d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.974332 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:18 crc kubenswrapper[4867]: I1201 09:09:18.985786 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:18Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.003613 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.019261 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28
edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.029841 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.030045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.030136 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.030235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.030325 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.032179 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.041773 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.053375 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.066533 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:19Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:19 crc 
kubenswrapper[4867]: I1201 09:09:19.133218 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.133268 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.133288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.133341 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.133354 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.236485 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.236570 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.236597 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.236630 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.236655 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.339110 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.339152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.339161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.339175 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.339186 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.440871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.440915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.440927 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.440943 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.440955 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.543564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.544416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.544592 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.544793 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.545035 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.646671 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.646711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.646721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.646733 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.646743 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.749061 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.749096 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.749108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.749123 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.749135 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.826877 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.826877 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.826889 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.826898 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:19 crc kubenswrapper[4867]: E1201 09:09:19.827071 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:19 crc kubenswrapper[4867]: E1201 09:09:19.827158 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:19 crc kubenswrapper[4867]: E1201 09:09:19.827235 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:19 crc kubenswrapper[4867]: E1201 09:09:19.827371 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.851673 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.851713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.851723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.851738 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.851747 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.954082 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.954150 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.954163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.954179 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:19 crc kubenswrapper[4867]: I1201 09:09:19.954190 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:19Z","lastTransitionTime":"2025-12-01T09:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.056347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.056390 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.056398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.056412 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.056422 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.159034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.159066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.159076 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.159090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.159101 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.243324 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/2.log" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.246709 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.248030 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.261498 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.261534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.261542 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.261557 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.261565 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.264743 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:09:05Z\\\",\\\"message\\\":\\\"2025-12-01T09:08:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf\\\\n2025-12-01T09:08:20+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf to /host/opt/cni/bin/\\\\n2025-12-01T09:08:20Z [verbose] multus-daemon started\\\\n2025-12-01T09:08:20Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:09:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:09:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.285576 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.301196 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.312993 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.324566 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.340247 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.352894 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.363945 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.363985 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.363995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.364014 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.364024 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.364436 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdd80a4-08b4-45b6-9559-e4ba382f17d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0116ddbcf9fa956327663d364bab7e5374fc92b182321bdc348b4af4ac47d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.375094 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.386000 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff
6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.403985 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 
09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.415459 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.425253 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.436354 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.445985 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc 
kubenswrapper[4867]: I1201 09:09:20.457938 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.466321 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.466354 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.466362 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.466376 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.466385 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.469332 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.484937 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:20Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.568508 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.568552 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.568561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.568574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.568583 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.671231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.671363 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.671486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.671576 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.671637 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.773786 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.773853 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.773865 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.773883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.773895 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.876349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.876397 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.876407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.876424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.876434 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.979120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.979167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.979178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.979196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:20 crc kubenswrapper[4867]: I1201 09:09:20.979207 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:20Z","lastTransitionTime":"2025-12-01T09:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.081898 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.081950 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.081963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.081980 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.081992 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:21Z","lastTransitionTime":"2025-12-01T09:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.184826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.186549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.186581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.186600 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.186612 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:21Z","lastTransitionTime":"2025-12-01T09:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.251127 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/3.log" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.251676 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/2.log" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.254374 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" exitCode=1 Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.254407 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.254435 4867 scope.go:117] "RemoveContainer" containerID="f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.255031 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.255198 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.267862 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.278458 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.288514 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.288673 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.288753 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.288920 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.289000 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:21Z","lastTransitionTime":"2025-12-01T09:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.291021 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.302399 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc 
kubenswrapper[4867]: I1201 09:09:21.314497 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.323695 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.356108 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.388993 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.390408 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.390458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.390469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.390485 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.390497 4867 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:21Z","lastTransitionTime":"2025-12-01T09:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.403374 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.417660 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.432522 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.445926 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-01T09:09:05Z\\\",\\\"message\\\":\\\"2025-12-01T09:08:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf\\\\n2025-12-01T09:08:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf to /host/opt/cni/bin/\\\\n2025-12-01T09:08:20Z [verbose] multus-daemon started\\\\n2025-12-01T09:08:20Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:09:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:09:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.465709 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.479246 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdd80a4-08b4-45b6-9559-e4ba382f17d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0116ddbcf9fa956327663d364bab7e5374fc92b182321bdc348b4af4ac47d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.491710 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.492721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.492793 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.492805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.492836 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.492849 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:21Z","lastTransitionTime":"2025-12-01T09:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.502360 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.520619 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4323d5d4c199022f17e067b18b0b9e2559324b1c2b6bac81e57b012dc0560a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:08:47Z\\\",\\\"message\\\":\\\"control-plane-749d76644c-zd2r2 in node crc\\\\nI1201 09:08:46.620689 6409 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2 after 0 failed attempt(s)\\\\nI1201 09:08:46.620695 6409 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2\\\\nF1201 09:08:46.620697 6409 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:08:46Z is after 2025-08-24T17:21:41Z]\\\\nI1201 09:08:46.620179 6409 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-g6dw4 in node crc\\\\nI1201 09:08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:09:20Z\\\",\\\"message\\\":\\\"ter 0 failed attempt(s)\\\\nI1201 09:09:20.640201 6801 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-g6dw4\\\\nI1201 09:09:20.640058 6801 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-tdw66\\\\nI1201 09:09:20.640211 6801 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-tdw66 in node crc\\\\nI1201 09:09:20.640216 6801 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-dns/node-resolver-tdw66 after 0 failed attempt(s)\\\\nI1201 09:09:20.640221 6801 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-tdw66\\\\nI1201 09:09:20.640061 6801 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1201 09:09:20.640283 6801 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-n7wvd\\\\nI1201 09:09:20.640044 6801 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1201 09:09:20.640441 6801 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:09:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\
",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.534598 4867 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43
f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:21Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.595418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.595999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.596083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.596185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.596273 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:21Z","lastTransitionTime":"2025-12-01T09:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.607331 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.607522 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.607570 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.607610 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.607775 4867 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.607906 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.607885167 +0000 UTC m=+147.067271951 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.608030 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.608053 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.608067 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.608119 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.608127 4867 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.608099863 +0000 UTC m=+147.067486657 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.608170 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.608152134 +0000 UTC m=+147.067538928 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.608213 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.608190865 +0000 UTC m=+147.067577649 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.698910 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.699190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.699270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.699364 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.699454 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:21Z","lastTransitionTime":"2025-12-01T09:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.708662 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.708879 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.708901 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.708915 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.708964 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.708948076 +0000 UTC m=+147.168334830 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.801928 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.801969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.801980 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.801994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.802006 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:21Z","lastTransitionTime":"2025-12-01T09:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.826639 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.826714 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.826724 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.826759 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.827123 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.827314 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.827388 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:21 crc kubenswrapper[4867]: E1201 09:09:21.827435 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.842240 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.904353 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.904418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.904437 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.904462 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:21 crc kubenswrapper[4867]: I1201 09:09:21.904480 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:21Z","lastTransitionTime":"2025-12-01T09:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.007875 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.007959 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.007983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.008014 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.008047 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.111401 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.111470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.111493 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.111525 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.111548 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.123531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.123571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.123584 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.123600 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.123612 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: E1201 09:09:22.143151 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.149623 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.149688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.149707 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.149742 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.149760 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: E1201 09:09:22.165399 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.169489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.169539 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.169554 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.169573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.169625 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: E1201 09:09:22.187297 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.191840 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.191886 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.191918 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.191933 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.191942 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: E1201 09:09:22.203509 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.207540 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.207592 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.207606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.207626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.207640 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: E1201 09:09:22.220076 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8a65d7c2-3f9a-40e7-a739-7e76b1a2f333\\\",\\\"systemUUID\\\":\\\"6a9666c0-d065-46a2-bf0b-9da61e045701\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: E1201 09:09:22.220254 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.221504 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.221561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.221580 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.221602 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.221619 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.259007 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/3.log" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.263393 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:09:22 crc kubenswrapper[4867]: E1201 09:09:22.263707 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.276539 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://027b377e5536668fbe6ad4f286020f286cb324103fb838b3d9d79a2fa2afa647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.290683 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tdw66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7920004f-7b75-4925-8961-2629dc17ee30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://510bc4810977071edefc58aa2dd1abf62786c6fde355270f6f329f3216f62708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z8hn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tdw66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.304294 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecffc3b166f146e7b07706c389f3fabbc7e2ab87455c7586290ec8054eea8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.317094 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"348089e9-b989-4676-8bd8-b42073339059\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6e6e0bd5e341af61c7188b355678b489b09f3e4a79f242e945e0c7dd3fe97e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aece4a0cb85514e34cedd122e7bb34d32cd3a5e05e1b12cee5ca9ca6c0e63772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c618a33144e95be37ba5028033d05d32ac0ed129b7199b30e3688ede59b77b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T09:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.324035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.324074 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.324083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.324097 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.324107 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.330658 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e5768a469e958fa16d5bea4962cfc0567ce56741987fc0bb9f6fb529d2fda1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3f5320
055b4a77593568306433b50646f08401c033970fd2cd80ec5c8960df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.342911 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.356426 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6494ebd3-57c2-4d65-b44a-3e30e76910a9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://238d82d0c82767dc774e94a4a2eb9ec63e76689b1366244d60e20c643cd3941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10efee7ab9ce256b5ea35b88f3dd02a4522df9ca178b115391586af9d77703cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b48c48a65ae1276f323495eacc67a1c72eb2489c77429a3da90541a2f7101a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5999b4711e6686339901bfc65eaeb4a1639110a97cdd6f2fdf07485b24b355d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e6cd
27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e6cd27e9c9bcef718e718ac6eca79aca4c812eeb4d85279f94753e56cabb6b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbefa835d6054f71fc7981763e548bf5d5df568fe6171e83322b1db6ea6be17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399d12c6b849b1a25018933d98e40ff3702e93b9fd30e16fc53903f28e5b1ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddfxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g6dw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.372468 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tj9fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-01T09:09:05Z\\\",\\\"message\\\":\\\"2025-12-01T09:08:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf\\\\n2025-12-01T09:08:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fd70de5-a30a-465d-88bc-3c46781b6acf to /host/opt/cni/bin/\\\\n2025-12-01T09:08:20Z [verbose] multus-daemon started\\\\n2025-12-01T09:08:20Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:09:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:09:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2kcp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tj9fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.392346 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d0347f6-1984-4554-a31d-2d24225acd95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b438e5d02259e947be5d5089a9ac32fbc8350dc3405284a1f964893aceb81de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd763692f788ea59b96efddf685775bd84e2b2c0a65dc0f42046edaa1af170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ec19af9656f1e8902fe75ca4efb8163b4b2752f3dc34068c6d527ea0798ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae869ee9fbe3d353e9d6f83199e969ac80001996387141443ae2c0e83313a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49df9ea7c00e957b145971932c5d126c0d9bbebffd7af1c0607723129d1484ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://930f7c86b738b1a7e1dcbe936a40c95a3465d252dca30279f7f8a914c54e4b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3960379d7a475b21fa4d99692949afac596706c335029544d134bcca0c8b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812799b9ebca8d112fe3b7c2e309f8caca4fecda30920fa30ca648e02c72eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.404094 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcdd80a4-08b4-45b6-9559-e4ba382f17d0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb7508abad81d77278c9f93d6a866ef80f310a48a179dff68d8d630c5a5cb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0116ddbcf9fa956327663d364bab7e5374fc92b182321bdc348b4af4ac47d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdea1a5a16c58da916f9b5af087a90ba2d498d7fb4f405ad1c53f9da05d0e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f745093342307a14715256ab35747f2570ae845e9374376cf169da702608dfa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.414941 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.427064 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.427119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.427128 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.427141 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.427150 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.427856 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd237749-4cea-4ff6-a374-8da70f9c879a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da76de171b82903909d1d59d764ef7f6551f2b01aa71e30a5cf67935628a7893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnb2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mt9t2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.449266 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:09:20Z\\\",\\\"message\\\":\\\"ter 0 failed attempt(s)\\\\nI1201 09:09:20.640201 6801 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-g6dw4\\\\nI1201 09:09:20.640058 6801 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-tdw66\\\\nI1201 09:09:20.640211 6801 ovn.go:134] Ensuring zone local 
for Pod openshift-dns/node-resolver-tdw66 in node crc\\\\nI1201 09:09:20.640216 6801 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-tdw66 after 0 failed attempt(s)\\\\nI1201 09:09:20.640221 6801 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-tdw66\\\\nI1201 09:09:20.640061 6801 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI1201 09:09:20.640283 6801 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-n7wvd\\\\nI1201 09:09:20.640044 6801 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1201 09:09:20.640441 6801 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:09:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44c19592f50c5c65cc
0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4925\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kk2hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.461108 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"952d3740-c446-483d-805f-8c6a97cfbbd4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:08:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1201 09:08:16.821751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1201 09:08:16.821988 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:08:16.824281 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3970321760/tls.crt::/tmp/serving-cert-3970321760/tls.key\\\\\\\"\\\\nI1201 09:08:17.066478 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:08:17.074580 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:08:17.074605 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:08:17.074645 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:08:17.074652 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:08:17.082495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:08:17.082523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082528 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:08:17.082533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:08:17.082537 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:08:17.082541 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:08:17.082544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:08:17.082737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 09:08:17.085666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453a03ad6301167aa897573a7a6719d28
edecd825189e45609ac5f7b6d1b04ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.471656 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.480927 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9j4ch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0da9082-ce5b-48ef-ad08-d3f3c75ea937\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0c20adcd34a2be3983acc26a65d86e4b1344472423c52135dce7012d559bf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgxd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9j4ch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.492875 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b0a64a-1b1c-49e2-9715-29505b2c124b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d5c06d9135265045d53f83c60fee67d6c6a6a029ca6c6f1d87f0db56f7f5e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af8ef02ae3c8d182752ab2174f0f96fb7a2ddc2652364fd79f9cfe6ab7bcb248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qfp4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zd2r2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.505561 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qv845\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:08:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc 
kubenswrapper[4867]: I1201 09:09:22.518531 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49f6e1b7-daf5-4334-94c1-9d58f2b81e29\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61e584042c408ba8ab82818404e520d549af409a05af80fc95cd18dfedfe0d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://58e153c5974daa706cb48413036395d69b97895ea274c6750a6f047eae26199f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58e153c5974daa706cb48413036395d69b97895ea274c6750a6f047eae26199f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:07:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:09:22Z is after 2025-08-24T17:21:41Z" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.529302 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.529337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.529365 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 
09:09:22.529380 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.529391 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.632196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.632960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.633022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.633058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.633077 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.735511 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.735558 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.735570 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.735587 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.735599 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.837802 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.837981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.838001 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.838028 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.838048 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.941115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.941161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.941172 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.941187 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:22 crc kubenswrapper[4867]: I1201 09:09:22.941199 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:22Z","lastTransitionTime":"2025-12-01T09:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.043255 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.043317 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.043324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.043337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.043346 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.145094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.145192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.145224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.145253 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.145275 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.247655 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.247688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.247700 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.247714 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.247732 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.350356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.350427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.350451 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.350482 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.350503 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.452756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.452800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.452824 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.452840 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.452851 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.555717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.555761 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.555776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.555794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.555807 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.658656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.658740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.658762 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.658788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.658806 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.761720 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.761769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.761782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.761800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.761840 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.826390 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.826480 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.826422 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.826399 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:23 crc kubenswrapper[4867]: E1201 09:09:23.826561 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:23 crc kubenswrapper[4867]: E1201 09:09:23.826649 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:23 crc kubenswrapper[4867]: E1201 09:09:23.826728 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:23 crc kubenswrapper[4867]: E1201 09:09:23.826851 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.864875 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.864955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.864978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.865005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.865022 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.967957 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.967998 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.968011 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.968030 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:23 crc kubenswrapper[4867]: I1201 09:09:23.968044 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:23Z","lastTransitionTime":"2025-12-01T09:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.070769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.070797 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.070813 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.070854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.070866 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.173987 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.174038 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.174055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.174077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.174093 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.276390 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.276746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.277062 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.277176 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.277269 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.379277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.379356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.379379 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.379409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.379430 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.482226 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.482265 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.482274 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.482287 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.482296 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.584356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.584397 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.584409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.584425 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.584437 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.687613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.687682 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.687704 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.687731 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.687748 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.790200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.790293 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.790313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.790336 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.790353 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.893263 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.893302 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.893311 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.893324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.893333 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.995929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.996000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.996022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.996052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:24 crc kubenswrapper[4867]: I1201 09:09:24.996078 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:24Z","lastTransitionTime":"2025-12-01T09:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.098399 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.098439 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.098452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.098468 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.098482 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:25Z","lastTransitionTime":"2025-12-01T09:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.201270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.201319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.201340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.201370 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.201397 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:25Z","lastTransitionTime":"2025-12-01T09:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.304360 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.304411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.304433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.304458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.304480 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:25Z","lastTransitionTime":"2025-12-01T09:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.407455 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.407505 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.407521 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.407544 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.407561 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:25Z","lastTransitionTime":"2025-12-01T09:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.509900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.509954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.509969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.509991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.510007 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:25Z","lastTransitionTime":"2025-12-01T09:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.612132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.612167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.612175 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.612187 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.612197 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:25Z","lastTransitionTime":"2025-12-01T09:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.714744 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.714780 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.714790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.714806 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.714837 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:25Z","lastTransitionTime":"2025-12-01T09:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.817251 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.817283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.817290 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.817301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.817310 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:25Z","lastTransitionTime":"2025-12-01T09:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.826161 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.826219 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:25 crc kubenswrapper[4867]: E1201 09:09:25.826269 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.826368 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:25 crc kubenswrapper[4867]: E1201 09:09:25.826423 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.826412 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:25 crc kubenswrapper[4867]: E1201 09:09:25.826793 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:25 crc kubenswrapper[4867]: E1201 09:09:25.826887 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.919532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.919571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.919581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.919597 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:25 crc kubenswrapper[4867]: I1201 09:09:25.919608 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:25Z","lastTransitionTime":"2025-12-01T09:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.022482 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.022540 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.022557 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.022578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.022596 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.125051 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.125120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.125136 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.125160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.125177 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.228323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.228372 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.228384 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.228400 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.228411 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.330747 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.330787 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.330798 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.330835 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.330845 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.433193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.433277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.433295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.433323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.433341 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.537622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.537665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.537676 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.537692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.537704 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.639554 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.639596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.639606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.639621 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.639635 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.742220 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.742257 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.742265 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.742279 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.742288 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.844777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.844838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.844851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.844868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.844880 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.948534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.948598 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.948609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.948628 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:26 crc kubenswrapper[4867]: I1201 09:09:26.948639 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:26Z","lastTransitionTime":"2025-12-01T09:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.052015 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.052078 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.052095 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.052119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.052137 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.154487 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.154540 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.154555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.154578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.154592 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.257354 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.257401 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.257412 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.257429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.257440 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.360164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.360198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.360208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.360222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.360232 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.463126 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.463189 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.463201 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.463241 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.463255 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.565913 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.566002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.566028 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.566048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.566060 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.668125 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.668152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.668177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.668191 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.668199 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.770746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.770906 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.770924 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.770954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.770974 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.826863 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.826924 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.826949 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.826971 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:27 crc kubenswrapper[4867]: E1201 09:09:27.827079 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:27 crc kubenswrapper[4867]: E1201 09:09:27.827277 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:27 crc kubenswrapper[4867]: E1201 09:09:27.827433 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:27 crc kubenswrapper[4867]: E1201 09:09:27.827568 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.874520 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.874581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.874595 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.874622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.874635 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.978679 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.978740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.978754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.978779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:27 crc kubenswrapper[4867]: I1201 09:09:27.978795 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:27Z","lastTransitionTime":"2025-12-01T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.081411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.081458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.081473 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.081495 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.081509 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:28Z","lastTransitionTime":"2025-12-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.183332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.183397 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.183409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.183425 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.183436 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:28Z","lastTransitionTime":"2025-12-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.286653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.286692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.286702 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.286718 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.286729 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:28Z","lastTransitionTime":"2025-12-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.388798 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.388854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.388869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.388887 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.388899 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:28Z","lastTransitionTime":"2025-12-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.491054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.491111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.491126 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.491148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.491165 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:28Z","lastTransitionTime":"2025-12-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.593567 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.593633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.593646 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.593662 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.593674 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:28Z","lastTransitionTime":"2025-12-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.696404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.696437 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.696446 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.696460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.696470 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:28Z","lastTransitionTime":"2025-12-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.798839 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.798882 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.798890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.798905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.798914 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:28Z","lastTransitionTime":"2025-12-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.844366 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.84435359 podStartE2EDuration="7.84435359s" podCreationTimestamp="2025-12-01 09:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:28.844170745 +0000 UTC m=+90.303557509" watchObservedRunningTime="2025-12-01 09:09:28.84435359 +0000 UTC m=+90.303740344" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.888066 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zd2r2" podStartSLOduration=70.888051195 podStartE2EDuration="1m10.888051195s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:28.887592753 +0000 UTC m=+90.346979507" watchObservedRunningTime="2025-12-01 09:09:28.888051195 +0000 UTC m=+90.347437949" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.888241 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9j4ch" podStartSLOduration=71.88823671 podStartE2EDuration="1m11.88823671s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:28.875543906 +0000 UTC m=+90.334930660" watchObservedRunningTime="2025-12-01 09:09:28.88823671 +0000 UTC m=+90.347623464" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.901554 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:28 crc kubenswrapper[4867]: 
I1201 09:09:28.901589 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.901600 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.901615 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.901627 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:28Z","lastTransitionTime":"2025-12-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.944138 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tdw66" podStartSLOduration=72.944123355 podStartE2EDuration="1m12.944123355s" podCreationTimestamp="2025-12-01 09:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:28.943959381 +0000 UTC m=+90.403346165" watchObservedRunningTime="2025-12-01 09:09:28.944123355 +0000 UTC m=+90.403510109" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.989709 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.98968626 podStartE2EDuration="1m11.98968626s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:28.974060736 +0000 UTC 
m=+90.433447490" watchObservedRunningTime="2025-12-01 09:09:28.98968626 +0000 UTC m=+90.449073064" Dec 01 09:09:28 crc kubenswrapper[4867]: I1201 09:09:28.990139 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.990120041 podStartE2EDuration="1m10.990120041s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:28.989941457 +0000 UTC m=+90.449328211" watchObservedRunningTime="2025-12-01 09:09:28.990120041 +0000 UTC m=+90.449506815" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.003188 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.003221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.003233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.003248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.003260 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.062991 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g6dw4" podStartSLOduration=71.062973446 podStartE2EDuration="1m11.062973446s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:29.042528862 +0000 UTC m=+90.501915616" watchObservedRunningTime="2025-12-01 09:09:29.062973446 +0000 UTC m=+90.522360200" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.063269 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tj9fl" podStartSLOduration=71.063264154 podStartE2EDuration="1m11.063264154s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:29.063181412 +0000 UTC m=+90.522568186" watchObservedRunningTime="2025-12-01 09:09:29.063264154 +0000 UTC m=+90.522650908" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.085263 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.08524175 podStartE2EDuration="1m12.08524175s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:29.084738366 +0000 UTC m=+90.544125130" watchObservedRunningTime="2025-12-01 09:09:29.08524175 +0000 UTC m=+90.544628514" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.105892 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.105959 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.105970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.106004 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.106016 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.109510 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.109498168 podStartE2EDuration="41.109498168s" podCreationTimestamp="2025-12-01 09:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:29.096468504 +0000 UTC m=+90.555855258" watchObservedRunningTime="2025-12-01 09:09:29.109498168 +0000 UTC m=+90.568884922" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.130411 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podStartSLOduration=72.130394564 podStartE2EDuration="1m12.130394564s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
09:09:29.129379236 +0000 UTC m=+90.588765990" watchObservedRunningTime="2025-12-01 09:09:29.130394564 +0000 UTC m=+90.589781318" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.209008 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.209056 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.209093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.209111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.209122 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.311726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.311777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.311795 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.311846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.311859 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.414940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.415230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.415334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.415421 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.415478 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.518303 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.518355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.518365 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.518381 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.518393 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.621459 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.621746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.621839 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.621905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.621961 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.724556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.724599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.724610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.724628 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.724641 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.825984 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:29 crc kubenswrapper[4867]: E1201 09:09:29.826153 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.826198 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.826275 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:29 crc kubenswrapper[4867]: E1201 09:09:29.826391 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.826593 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:29 crc kubenswrapper[4867]: E1201 09:09:29.826651 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:29 crc kubenswrapper[4867]: E1201 09:09:29.826728 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.827325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.827365 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.827378 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.827393 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.827404 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.929466 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.929500 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.929511 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.929526 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:29 crc kubenswrapper[4867]: I1201 09:09:29.929537 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:29Z","lastTransitionTime":"2025-12-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.032355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.032395 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.032406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.032423 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.032437 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.135115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.135159 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.135168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.135183 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.135193 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.237196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.237237 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.237248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.237263 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.237274 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.339901 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.339963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.339978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.339996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.340011 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.442939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.442993 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.443011 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.443036 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.443054 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.545400 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.545456 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.545472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.545496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.545512 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.648380 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.648446 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.648461 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.648487 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.648502 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.752075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.752145 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.752167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.752196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.752217 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.854555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.854592 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.854601 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.854616 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.854625 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.956938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.957013 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.957035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.957059 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:30 crc kubenswrapper[4867]: I1201 09:09:30.957075 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:30Z","lastTransitionTime":"2025-12-01T09:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.060350 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.060403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.060415 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.060432 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.060448 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.163390 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.163460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.163470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.163482 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.163491 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.266027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.266092 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.266106 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.266122 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.266154 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.368647 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.368682 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.368690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.368702 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.368711 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.470466 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.470502 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.470511 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.470524 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.470534 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.573481 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.573547 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.573562 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.573585 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.573599 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.676168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.676221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.676233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.676250 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.676260 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.779044 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.779116 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.779134 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.779160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.779177 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.826528 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.826585 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.826627 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.826609 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:31 crc kubenswrapper[4867]: E1201 09:09:31.826702 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:31 crc kubenswrapper[4867]: E1201 09:09:31.826852 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:31 crc kubenswrapper[4867]: E1201 09:09:31.827013 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:31 crc kubenswrapper[4867]: E1201 09:09:31.827083 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.882741 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.882803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.883023 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.883047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.883371 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.985644 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.985668 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.985675 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.985687 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:31 crc kubenswrapper[4867]: I1201 09:09:31.985694 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:31Z","lastTransitionTime":"2025-12-01T09:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.089104 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.089150 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.089160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.089175 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.089185 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:32Z","lastTransitionTime":"2025-12-01T09:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.192046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.192090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.192103 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.192119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.192132 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:32Z","lastTransitionTime":"2025-12-01T09:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.294316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.294389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.294410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.294436 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.294456 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:32Z","lastTransitionTime":"2025-12-01T09:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.375489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.375568 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.375591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.375624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.375645 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:09:32Z","lastTransitionTime":"2025-12-01T09:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.427735 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8"] Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.428595 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.430297 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.430581 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.431308 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.431358 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.524009 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b66d362d-e1fa-464a-b6bf-adff2c124601-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.524088 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b66d362d-e1fa-464a-b6bf-adff2c124601-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.524121 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b66d362d-e1fa-464a-b6bf-adff2c124601-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.524225 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b66d362d-e1fa-464a-b6bf-adff2c124601-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.524333 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b66d362d-e1fa-464a-b6bf-adff2c124601-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.625377 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b66d362d-e1fa-464a-b6bf-adff2c124601-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.625452 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b66d362d-e1fa-464a-b6bf-adff2c124601-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.625492 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b66d362d-e1fa-464a-b6bf-adff2c124601-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.625529 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b66d362d-e1fa-464a-b6bf-adff2c124601-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.625534 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b66d362d-e1fa-464a-b6bf-adff2c124601-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.625573 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b66d362d-e1fa-464a-b6bf-adff2c124601-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.625670 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/b66d362d-e1fa-464a-b6bf-adff2c124601-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.626450 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b66d362d-e1fa-464a-b6bf-adff2c124601-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.633585 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b66d362d-e1fa-464a-b6bf-adff2c124601-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.640858 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b66d362d-e1fa-464a-b6bf-adff2c124601-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m8sf8\" (UID: \"b66d362d-e1fa-464a-b6bf-adff2c124601\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.750929 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" Dec 01 09:09:32 crc kubenswrapper[4867]: W1201 09:09:32.775334 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb66d362d_e1fa_464a_b6bf_adff2c124601.slice/crio-8fce392768160622e10874d10b9ec2fdba543270885fe51132e2f7063c90bfe3 WatchSource:0}: Error finding container 8fce392768160622e10874d10b9ec2fdba543270885fe51132e2f7063c90bfe3: Status 404 returned error can't find the container with id 8fce392768160622e10874d10b9ec2fdba543270885fe51132e2f7063c90bfe3 Dec 01 09:09:32 crc kubenswrapper[4867]: I1201 09:09:32.828270 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:09:32 crc kubenswrapper[4867]: E1201 09:09:32.828796 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" Dec 01 09:09:33 crc kubenswrapper[4867]: I1201 09:09:33.296727 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" event={"ID":"b66d362d-e1fa-464a-b6bf-adff2c124601","Type":"ContainerStarted","Data":"217db18038aab784a38e8f8a52bc2c81071f513094489f755cce23f4b61911ba"} Dec 01 09:09:33 crc kubenswrapper[4867]: I1201 09:09:33.297174 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" event={"ID":"b66d362d-e1fa-464a-b6bf-adff2c124601","Type":"ContainerStarted","Data":"8fce392768160622e10874d10b9ec2fdba543270885fe51132e2f7063c90bfe3"} Dec 01 09:09:33 crc kubenswrapper[4867]: 
I1201 09:09:33.310345 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m8sf8" podStartSLOduration=76.310321497 podStartE2EDuration="1m16.310321497s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:09:33.309085094 +0000 UTC m=+94.768471868" watchObservedRunningTime="2025-12-01 09:09:33.310321497 +0000 UTC m=+94.769708281" Dec 01 09:09:33 crc kubenswrapper[4867]: I1201 09:09:33.826614 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:33 crc kubenswrapper[4867]: I1201 09:09:33.826611 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:33 crc kubenswrapper[4867]: I1201 09:09:33.826608 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:33 crc kubenswrapper[4867]: E1201 09:09:33.826860 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:33 crc kubenswrapper[4867]: E1201 09:09:33.826962 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:33 crc kubenswrapper[4867]: I1201 09:09:33.827026 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:33 crc kubenswrapper[4867]: E1201 09:09:33.827228 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:33 crc kubenswrapper[4867]: E1201 09:09:33.827032 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:35 crc kubenswrapper[4867]: I1201 09:09:35.759356 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:35 crc kubenswrapper[4867]: E1201 09:09:35.759499 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:09:35 crc kubenswrapper[4867]: E1201 09:09:35.759602 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs podName:c3ff1be1-b98b-483b-83ca-eb2255f66c7c nodeName:}" failed. No retries permitted until 2025-12-01 09:10:39.759576427 +0000 UTC m=+161.218963191 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs") pod "network-metrics-daemon-n7wvd" (UID: "c3ff1be1-b98b-483b-83ca-eb2255f66c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:09:35 crc kubenswrapper[4867]: I1201 09:09:35.826602 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:35 crc kubenswrapper[4867]: I1201 09:09:35.826647 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:35 crc kubenswrapper[4867]: I1201 09:09:35.826691 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:35 crc kubenswrapper[4867]: I1201 09:09:35.826636 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:35 crc kubenswrapper[4867]: E1201 09:09:35.826806 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:35 crc kubenswrapper[4867]: E1201 09:09:35.827078 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:35 crc kubenswrapper[4867]: E1201 09:09:35.827151 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:35 crc kubenswrapper[4867]: E1201 09:09:35.827267 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:37 crc kubenswrapper[4867]: I1201 09:09:37.826203 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:37 crc kubenswrapper[4867]: I1201 09:09:37.826248 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:37 crc kubenswrapper[4867]: I1201 09:09:37.826254 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:37 crc kubenswrapper[4867]: I1201 09:09:37.826219 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:37 crc kubenswrapper[4867]: E1201 09:09:37.826355 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:37 crc kubenswrapper[4867]: E1201 09:09:37.826404 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:37 crc kubenswrapper[4867]: E1201 09:09:37.826491 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:37 crc kubenswrapper[4867]: E1201 09:09:37.826535 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:39 crc kubenswrapper[4867]: I1201 09:09:39.826464 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:39 crc kubenswrapper[4867]: E1201 09:09:39.826643 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:39 crc kubenswrapper[4867]: I1201 09:09:39.826910 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:39 crc kubenswrapper[4867]: E1201 09:09:39.826971 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:39 crc kubenswrapper[4867]: I1201 09:09:39.827095 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:39 crc kubenswrapper[4867]: I1201 09:09:39.827140 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:39 crc kubenswrapper[4867]: E1201 09:09:39.827479 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:39 crc kubenswrapper[4867]: E1201 09:09:39.827616 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:41 crc kubenswrapper[4867]: I1201 09:09:41.826651 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:41 crc kubenswrapper[4867]: I1201 09:09:41.826741 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:41 crc kubenswrapper[4867]: I1201 09:09:41.826781 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:41 crc kubenswrapper[4867]: I1201 09:09:41.826665 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:41 crc kubenswrapper[4867]: E1201 09:09:41.826866 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:41 crc kubenswrapper[4867]: E1201 09:09:41.827429 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:41 crc kubenswrapper[4867]: E1201 09:09:41.827706 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:41 crc kubenswrapper[4867]: E1201 09:09:41.827910 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:43 crc kubenswrapper[4867]: I1201 09:09:43.826765 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:43 crc kubenswrapper[4867]: I1201 09:09:43.826796 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:43 crc kubenswrapper[4867]: E1201 09:09:43.826885 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:43 crc kubenswrapper[4867]: E1201 09:09:43.827246 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:43 crc kubenswrapper[4867]: I1201 09:09:43.827548 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:43 crc kubenswrapper[4867]: E1201 09:09:43.828158 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:43 crc kubenswrapper[4867]: I1201 09:09:43.828642 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:43 crc kubenswrapper[4867]: E1201 09:09:43.828991 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:45 crc kubenswrapper[4867]: I1201 09:09:45.826459 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:45 crc kubenswrapper[4867]: I1201 09:09:45.826514 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:45 crc kubenswrapper[4867]: E1201 09:09:45.826596 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:45 crc kubenswrapper[4867]: I1201 09:09:45.826674 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:45 crc kubenswrapper[4867]: E1201 09:09:45.826790 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:45 crc kubenswrapper[4867]: E1201 09:09:45.826863 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:45 crc kubenswrapper[4867]: I1201 09:09:45.827598 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:45 crc kubenswrapper[4867]: E1201 09:09:45.827838 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:47 crc kubenswrapper[4867]: I1201 09:09:47.826697 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:47 crc kubenswrapper[4867]: I1201 09:09:47.827859 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:47 crc kubenswrapper[4867]: I1201 09:09:47.827148 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:47 crc kubenswrapper[4867]: I1201 09:09:47.827358 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:09:47 crc kubenswrapper[4867]: E1201 09:09:47.828058 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kk2hn_openshift-ovn-kubernetes(8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" Dec 01 09:09:47 crc kubenswrapper[4867]: I1201 09:09:47.827018 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:47 crc kubenswrapper[4867]: E1201 09:09:47.828539 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:47 crc kubenswrapper[4867]: E1201 09:09:47.828629 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:47 crc kubenswrapper[4867]: E1201 09:09:47.828678 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:47 crc kubenswrapper[4867]: E1201 09:09:47.828721 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:49 crc kubenswrapper[4867]: I1201 09:09:49.826241 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:49 crc kubenswrapper[4867]: I1201 09:09:49.826240 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:49 crc kubenswrapper[4867]: I1201 09:09:49.826248 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:49 crc kubenswrapper[4867]: I1201 09:09:49.826376 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:49 crc kubenswrapper[4867]: E1201 09:09:49.826545 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:49 crc kubenswrapper[4867]: E1201 09:09:49.826650 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:49 crc kubenswrapper[4867]: E1201 09:09:49.826751 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:49 crc kubenswrapper[4867]: E1201 09:09:49.826883 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:51 crc kubenswrapper[4867]: I1201 09:09:51.826131 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:51 crc kubenswrapper[4867]: I1201 09:09:51.826189 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:51 crc kubenswrapper[4867]: I1201 09:09:51.826143 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:51 crc kubenswrapper[4867]: I1201 09:09:51.826138 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:51 crc kubenswrapper[4867]: E1201 09:09:51.826325 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:51 crc kubenswrapper[4867]: E1201 09:09:51.826383 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:51 crc kubenswrapper[4867]: E1201 09:09:51.826476 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:51 crc kubenswrapper[4867]: E1201 09:09:51.826655 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:52 crc kubenswrapper[4867]: I1201 09:09:52.364923 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tj9fl_c813b7ba-4c04-44d0-9f3e-3e5f4897fb73/kube-multus/1.log" Dec 01 09:09:52 crc kubenswrapper[4867]: I1201 09:09:52.365649 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tj9fl_c813b7ba-4c04-44d0-9f3e-3e5f4897fb73/kube-multus/0.log" Dec 01 09:09:52 crc kubenswrapper[4867]: I1201 09:09:52.365722 4867 generic.go:334] "Generic (PLEG): container finished" podID="c813b7ba-4c04-44d0-9f3e-3e5f4897fb73" containerID="5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b" exitCode=1 Dec 01 09:09:52 crc kubenswrapper[4867]: I1201 09:09:52.365765 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tj9fl" 
event={"ID":"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73","Type":"ContainerDied","Data":"5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b"} Dec 01 09:09:52 crc kubenswrapper[4867]: I1201 09:09:52.365869 4867 scope.go:117] "RemoveContainer" containerID="7d148348fbd978bef80a064946c0215238e9666ab74c19484b83066a7fda32a1" Dec 01 09:09:52 crc kubenswrapper[4867]: I1201 09:09:52.367782 4867 scope.go:117] "RemoveContainer" containerID="5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b" Dec 01 09:09:52 crc kubenswrapper[4867]: E1201 09:09:52.370010 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-tj9fl_openshift-multus(c813b7ba-4c04-44d0-9f3e-3e5f4897fb73)\"" pod="openshift-multus/multus-tj9fl" podUID="c813b7ba-4c04-44d0-9f3e-3e5f4897fb73" Dec 01 09:09:53 crc kubenswrapper[4867]: I1201 09:09:53.369953 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tj9fl_c813b7ba-4c04-44d0-9f3e-3e5f4897fb73/kube-multus/1.log" Dec 01 09:09:53 crc kubenswrapper[4867]: I1201 09:09:53.826545 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:53 crc kubenswrapper[4867]: I1201 09:09:53.826596 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:53 crc kubenswrapper[4867]: E1201 09:09:53.826695 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:53 crc kubenswrapper[4867]: I1201 09:09:53.826550 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:53 crc kubenswrapper[4867]: E1201 09:09:53.826795 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:53 crc kubenswrapper[4867]: I1201 09:09:53.826808 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:53 crc kubenswrapper[4867]: E1201 09:09:53.826898 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:53 crc kubenswrapper[4867]: E1201 09:09:53.826956 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:55 crc kubenswrapper[4867]: I1201 09:09:55.826981 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:55 crc kubenswrapper[4867]: I1201 09:09:55.827076 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:55 crc kubenswrapper[4867]: I1201 09:09:55.827078 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:55 crc kubenswrapper[4867]: E1201 09:09:55.827129 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:55 crc kubenswrapper[4867]: E1201 09:09:55.827245 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:55 crc kubenswrapper[4867]: I1201 09:09:55.827291 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:55 crc kubenswrapper[4867]: E1201 09:09:55.827612 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:55 crc kubenswrapper[4867]: E1201 09:09:55.827705 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:57 crc kubenswrapper[4867]: I1201 09:09:57.826854 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:57 crc kubenswrapper[4867]: I1201 09:09:57.826945 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:57 crc kubenswrapper[4867]: I1201 09:09:57.826992 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:57 crc kubenswrapper[4867]: I1201 09:09:57.827034 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:57 crc kubenswrapper[4867]: E1201 09:09:57.827095 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:57 crc kubenswrapper[4867]: E1201 09:09:57.827268 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:57 crc kubenswrapper[4867]: E1201 09:09:57.827517 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:09:57 crc kubenswrapper[4867]: E1201 09:09:57.827411 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:58 crc kubenswrapper[4867]: E1201 09:09:58.796724 4867 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 01 09:09:59 crc kubenswrapper[4867]: E1201 09:09:59.049336 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 09:09:59 crc kubenswrapper[4867]: I1201 09:09:59.826870 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:09:59 crc kubenswrapper[4867]: E1201 09:09:59.826986 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:09:59 crc kubenswrapper[4867]: I1201 09:09:59.827045 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:09:59 crc kubenswrapper[4867]: E1201 09:09:59.827094 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:09:59 crc kubenswrapper[4867]: I1201 09:09:59.827141 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:09:59 crc kubenswrapper[4867]: E1201 09:09:59.827188 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:09:59 crc kubenswrapper[4867]: I1201 09:09:59.827452 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:09:59 crc kubenswrapper[4867]: E1201 09:09:59.827503 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:10:01 crc kubenswrapper[4867]: I1201 09:10:01.827054 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:01 crc kubenswrapper[4867]: I1201 09:10:01.827058 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:10:01 crc kubenswrapper[4867]: E1201 09:10:01.827590 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:10:01 crc kubenswrapper[4867]: E1201 09:10:01.827708 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:10:01 crc kubenswrapper[4867]: I1201 09:10:01.827979 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:10:01 crc kubenswrapper[4867]: I1201 09:10:01.828158 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:10:01 crc kubenswrapper[4867]: E1201 09:10:01.828396 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:10:01 crc kubenswrapper[4867]: I1201 09:10:01.828922 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:10:01 crc kubenswrapper[4867]: E1201 09:10:01.829563 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:10:02 crc kubenswrapper[4867]: I1201 09:10:02.404738 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/3.log" Dec 01 09:10:02 crc kubenswrapper[4867]: I1201 09:10:02.407960 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerStarted","Data":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} Dec 01 09:10:02 crc kubenswrapper[4867]: I1201 09:10:02.408910 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:10:02 crc kubenswrapper[4867]: I1201 09:10:02.635543 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podStartSLOduration=104.635525842 podStartE2EDuration="1m44.635525842s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:02.437008911 +0000 UTC m=+123.896395685" 
watchObservedRunningTime="2025-12-01 09:10:02.635525842 +0000 UTC m=+124.094912596" Dec 01 09:10:02 crc kubenswrapper[4867]: I1201 09:10:02.635965 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n7wvd"] Dec 01 09:10:02 crc kubenswrapper[4867]: I1201 09:10:02.636057 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:10:02 crc kubenswrapper[4867]: E1201 09:10:02.636134 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:10:03 crc kubenswrapper[4867]: I1201 09:10:03.826884 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:10:03 crc kubenswrapper[4867]: I1201 09:10:03.827438 4867 scope.go:117] "RemoveContainer" containerID="5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b" Dec 01 09:10:03 crc kubenswrapper[4867]: I1201 09:10:03.826914 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:10:03 crc kubenswrapper[4867]: E1201 09:10:03.827501 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:10:03 crc kubenswrapper[4867]: I1201 09:10:03.826954 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:10:03 crc kubenswrapper[4867]: I1201 09:10:03.826991 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:03 crc kubenswrapper[4867]: E1201 09:10:03.827760 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:10:03 crc kubenswrapper[4867]: E1201 09:10:03.827929 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:10:03 crc kubenswrapper[4867]: E1201 09:10:03.828023 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:10:04 crc kubenswrapper[4867]: E1201 09:10:04.050708 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 09:10:05 crc kubenswrapper[4867]: I1201 09:10:05.422136 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tj9fl_c813b7ba-4c04-44d0-9f3e-3e5f4897fb73/kube-multus/1.log" Dec 01 09:10:05 crc kubenswrapper[4867]: I1201 09:10:05.422497 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tj9fl" event={"ID":"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73","Type":"ContainerStarted","Data":"dcafca89c954d08442575fb0289c167ad5b6c82180418cf7ed2a2e73f47682cb"} Dec 01 09:10:05 crc kubenswrapper[4867]: I1201 09:10:05.826607 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:05 crc kubenswrapper[4867]: I1201 09:10:05.826692 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:10:05 crc kubenswrapper[4867]: I1201 09:10:05.826639 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:10:05 crc kubenswrapper[4867]: I1201 09:10:05.826607 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:10:05 crc kubenswrapper[4867]: E1201 09:10:05.826852 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:10:05 crc kubenswrapper[4867]: E1201 09:10:05.827016 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:10:05 crc kubenswrapper[4867]: E1201 09:10:05.827102 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:10:05 crc kubenswrapper[4867]: E1201 09:10:05.827193 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:10:07 crc kubenswrapper[4867]: I1201 09:10:07.826278 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:10:07 crc kubenswrapper[4867]: E1201 09:10:07.826447 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:10:07 crc kubenswrapper[4867]: I1201 09:10:07.826738 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:07 crc kubenswrapper[4867]: E1201 09:10:07.826856 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:10:07 crc kubenswrapper[4867]: I1201 09:10:07.827049 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:10:07 crc kubenswrapper[4867]: E1201 09:10:07.828796 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:10:07 crc kubenswrapper[4867]: I1201 09:10:07.827312 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:10:07 crc kubenswrapper[4867]: E1201 09:10:07.829114 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wvd" podUID="c3ff1be1-b98b-483b-83ca-eb2255f66c7c" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.826612 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.826660 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.826612 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.827173 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.831343 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.831645 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.831883 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.832254 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.834945 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 09:10:09 crc kubenswrapper[4867]: I1201 09:10:09.835265 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.063403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.107222 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g8jfw"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.107947 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.108474 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-545ws"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.109410 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.110435 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.111234 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.133143 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.133834 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.134610 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.141844 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.143204 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.143579 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.145076 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.145101 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.145380 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.145697 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.146078 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.146240 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4dj8h"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.146395 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.146478 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.146733 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.146852 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.147013 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.147192 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.151485 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.152371 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.152426 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.152535 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.152730 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.152834 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.152878 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.153095 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.153132 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.155791 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.167303 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.167583 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.167620 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.167770 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.168297 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.169019 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.169402 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.170447 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bnd8f"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.171059 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.179999 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181569 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e6598f7-031a-4561-bde4-ae61121a17cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181624 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-audit\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181652 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24dkw\" (UniqueName: \"kubernetes.io/projected/5e1caefa-d624-4351-89a3-d8c33a7924d6-kube-api-access-24dkw\") pod \"cluster-samples-operator-665b6dd947-7sqxq\" (UID: 
\"5e1caefa-d624-4351-89a3-d8c33a7924d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181675 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-config\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181694 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b075132-2629-49ad-9361-42fe48ae5b57-etcd-client\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181719 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-client-ca\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181739 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-auth-proxy-config\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181757 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-config\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181780 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-config\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181801 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-client-ca\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181849 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-555js\" (UniqueName: \"kubernetes.io/projected/8b075132-2629-49ad-9361-42fe48ae5b57-kube-api-access-555js\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181868 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e6598f7-031a-4561-bde4-ae61121a17cd-etcd-client\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 
01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181888 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e6598f7-031a-4561-bde4-ae61121a17cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181908 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68ns4\" (UniqueName: \"kubernetes.io/projected/bc693be6-558a-41e9-96cd-40061ff9ae5d-kube-api-access-68ns4\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181930 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b30e79f-de92-4b18-8b47-31cd45e753f1-serving-cert\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181950 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-machine-approver-tls\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.181980 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc693be6-558a-41e9-96cd-40061ff9ae5d-serving-cert\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182021 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e6598f7-031a-4561-bde4-ae61121a17cd-audit-dir\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182037 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-etcd-serving-ca\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182052 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e6598f7-031a-4561-bde4-ae61121a17cd-encryption-config\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182072 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-552vl\" (UniqueName: \"kubernetes.io/projected/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-kube-api-access-552vl\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 
09:10:13.182087 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-config\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182103 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6598f7-031a-4561-bde4-ae61121a17cd-serving-cert\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182129 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e1caefa-d624-4351-89a3-d8c33a7924d6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7sqxq\" (UID: \"5e1caefa-d624-4351-89a3-d8c33a7924d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182145 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b075132-2629-49ad-9361-42fe48ae5b57-node-pullsecrets\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182219 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4dj8h\" (UID: 
\"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182236 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg2qq\" (UniqueName: \"kubernetes.io/projected/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-kube-api-access-dg2qq\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182259 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182275 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-image-import-ca\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182289 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7dvp\" (UniqueName: \"kubernetes.io/projected/8e6598f7-031a-4561-bde4-ae61121a17cd-kube-api-access-h7dvp\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182305 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl9k6\" (UniqueName: \"kubernetes.io/projected/0b30e79f-de92-4b18-8b47-31cd45e753f1-kube-api-access-nl9k6\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182318 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b075132-2629-49ad-9361-42fe48ae5b57-serving-cert\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182333 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b075132-2629-49ad-9361-42fe48ae5b57-audit-dir\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182359 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-images\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182374 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182388 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8e6598f7-031a-4561-bde4-ae61121a17cd-audit-policies\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182402 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-config\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.182443 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b075132-2629-49ad-9361-42fe48ae5b57-encryption-config\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.183145 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zh8bh"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.183669 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.183782 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zh8bh" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.184770 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.185128 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.185364 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.185372 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kc82j"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.185708 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.186867 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.187204 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.188247 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p989q"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.188512 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.189742 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9k7d"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.190392 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.191387 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.199003 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.200532 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.203132 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.214132 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.214501 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.214613 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.214776 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.219003 4867 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-t7zlw"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.219673 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.222074 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.223101 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.226374 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.226443 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.226664 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.226686 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.227047 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.227068 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.227172 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.227465 4867 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.227493 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.227734 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.227840 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.227948 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.228970 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.229630 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.230791 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.230997 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.231118 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.232497 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.232684 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.232898 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.233387 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.233433 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.233782 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.233871 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234014 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234173 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234263 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.233393 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234430 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234545 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234555 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234638 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234677 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234844 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 09:10:13 crc 
kubenswrapper[4867]: I1201 09:10:13.234892 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.234928 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235004 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235074 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235102 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235178 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235222 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235275 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235317 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235376 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235436 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235535 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235610 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235700 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235768 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235864 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.235999 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.237027 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.240561 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-959fr"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.241228 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.250278 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.250518 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.250679 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.250905 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.251141 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.250936 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.260595 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.262681 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.262964 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.265564 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.265958 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.266231 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.269050 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.293943 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.294867 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.295242 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296272 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e6598f7-031a-4561-bde4-ae61121a17cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296304 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-68ns4\" (UniqueName: \"kubernetes.io/projected/bc693be6-558a-41e9-96cd-40061ff9ae5d-kube-api-access-68ns4\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296321 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b30e79f-de92-4b18-8b47-31cd45e753f1-serving-cert\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296337 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-machine-approver-tls\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296353 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc693be6-558a-41e9-96cd-40061ff9ae5d-serving-cert\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296373 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e6598f7-031a-4561-bde4-ae61121a17cd-audit-dir\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296393 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/56d53753-56f3-40fc-ba22-f635027ed42d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296411 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvb4\" (UniqueName: \"kubernetes.io/projected/56d53753-56f3-40fc-ba22-f635027ed42d-kube-api-access-4gvb4\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296428 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-etcd-serving-ca\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296442 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e6598f7-031a-4561-bde4-ae61121a17cd-encryption-config\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296458 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-552vl\" 
(UniqueName: \"kubernetes.io/projected/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-kube-api-access-552vl\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296473 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-config\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296488 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6598f7-031a-4561-bde4-ae61121a17cd-serving-cert\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296510 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e1caefa-d624-4351-89a3-d8c33a7924d6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7sqxq\" (UID: \"5e1caefa-d624-4351-89a3-d8c33a7924d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296526 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b075132-2629-49ad-9361-42fe48ae5b57-node-pullsecrets\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296541 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296556 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg2qq\" (UniqueName: \"kubernetes.io/projected/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-kube-api-access-dg2qq\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296575 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-image-import-ca\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296589 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7dvp\" (UniqueName: \"kubernetes.io/projected/8e6598f7-031a-4561-bde4-ae61121a17cd-kube-api-access-h7dvp\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296603 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 
09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296617 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl9k6\" (UniqueName: \"kubernetes.io/projected/0b30e79f-de92-4b18-8b47-31cd45e753f1-kube-api-access-nl9k6\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296630 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b075132-2629-49ad-9361-42fe48ae5b57-serving-cert\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296644 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b075132-2629-49ad-9361-42fe48ae5b57-audit-dir\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296660 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56d53753-56f3-40fc-ba22-f635027ed42d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296677 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-images\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: 
\"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296692 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296708 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8e6598f7-031a-4561-bde4-ae61121a17cd-audit-policies\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296721 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56d53753-56f3-40fc-ba22-f635027ed42d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296743 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-config\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296756 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/8b075132-2629-49ad-9361-42fe48ae5b57-encryption-config\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296773 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e6598f7-031a-4561-bde4-ae61121a17cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296793 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-audit\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296835 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24dkw\" (UniqueName: \"kubernetes.io/projected/5e1caefa-d624-4351-89a3-d8c33a7924d6-kube-api-access-24dkw\") pod \"cluster-samples-operator-665b6dd947-7sqxq\" (UID: \"5e1caefa-d624-4351-89a3-d8c33a7924d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296852 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-config\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296867 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b075132-2629-49ad-9361-42fe48ae5b57-etcd-client\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296887 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-client-ca\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296900 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-auth-proxy-config\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296914 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-config\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296928 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-config\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296958 
4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-client-ca\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296973 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-555js\" (UniqueName: \"kubernetes.io/projected/8b075132-2629-49ad-9361-42fe48ae5b57-kube-api-access-555js\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.296986 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e6598f7-031a-4561-bde4-ae61121a17cd-etcd-client\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.297751 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.298394 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.298686 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.298865 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.299229 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-etcd-serving-ca\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.299599 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.300088 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e6598f7-031a-4561-bde4-ae61121a17cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.300129 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e6598f7-031a-4561-bde4-ae61121a17cd-audit-dir\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.300359 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.303009 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e1caefa-d624-4351-89a3-d8c33a7924d6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7sqxq\" (UID: 
\"5e1caefa-d624-4351-89a3-d8c33a7924d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.303500 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.303754 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-kdm2m"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.304147 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.304389 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-l44jd"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.304676 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.305043 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.305185 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.308659 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.313788 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.314619 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-config\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.315659 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.314043 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e6598f7-031a-4561-bde4-ae61121a17cd-encryption-config\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.317062 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.317365 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rd976"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.317996 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t266c"] 
Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.318248 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.318561 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.318786 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.318980 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.319119 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.319621 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc693be6-558a-41e9-96cd-40061ff9ae5d-serving-cert\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.321766 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-images\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.322419 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-machine-approver-tls\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.323105 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b30e79f-de92-4b18-8b47-31cd45e753f1-serving-cert\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.323896 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-client-ca\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.323993 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6598f7-031a-4561-bde4-ae61121a17cd-serving-cert\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.324065 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b075132-2629-49ad-9361-42fe48ae5b57-audit-dir\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.324277 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-audit\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.324898 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-config\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.326288 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b075132-2629-49ad-9361-42fe48ae5b57-serving-cert\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.326586 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-config\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.327081 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-config\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.327474 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-auth-proxy-config\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.327748 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-config\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.328139 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8e6598f7-031a-4561-bde4-ae61121a17cd-audit-policies\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.328755 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-client-ca\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.328947 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b075132-2629-49ad-9361-42fe48ae5b57-node-pullsecrets\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.329585 4867 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.329787 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.329869 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.330385 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.330547 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.343787 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.344095 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e6598f7-031a-4561-bde4-ae61121a17cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: 
\"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.344217 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.353736 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e6598f7-031a-4561-bde4-ae61121a17cd-etcd-client\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.354049 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.354654 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b075132-2629-49ad-9361-42fe48ae5b57-encryption-config\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.354836 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9n2mx"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.354950 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.355594 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b075132-2629-49ad-9361-42fe48ae5b57-etcd-client\") pod \"apiserver-76f77b778f-4dj8h\" (UID: 
\"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.357284 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b075132-2629-49ad-9361-42fe48ae5b57-image-import-ca\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.366280 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.384929 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.385462 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.385489 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.385853 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.386391 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.385535 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.386899 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kbx9t"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.387530 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.388531 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bdwkr"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.389297 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6w6sm"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.389656 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.390129 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.390703 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.401440 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.405156 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.405636 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.405649 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-545ws"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.405746 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.415286 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/56d53753-56f3-40fc-ba22-f635027ed42d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.415315 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvb4\" (UniqueName: \"kubernetes.io/projected/56d53753-56f3-40fc-ba22-f635027ed42d-kube-api-access-4gvb4\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.415397 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.415522 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56d53753-56f3-40fc-ba22-f635027ed42d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.415576 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56d53753-56f3-40fc-ba22-f635027ed42d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.417035 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56d53753-56f3-40fc-ba22-f635027ed42d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.417834 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.418047 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.418079 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v6frd"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 
09:10:13.418390 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/56d53753-56f3-40fc-ba22-f635027ed42d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.419009 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g8jfw"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.419098 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.420185 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.422302 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.423261 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.424507 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.425095 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.426068 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kc82j"] Dec 01 09:10:13 crc kubenswrapper[4867]: 
I1201 09:10:13.427215 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4dj8h"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.429744 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kbx9t"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.430109 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.431237 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p989q"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.431670 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rd976"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.433096 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9k7d"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.433867 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9n2mx"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.434998 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zh8bh"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.435876 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6w6sm"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.437201 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-26zfv"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.437942 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bnd8f"] Dec 01 09:10:13 crc 
kubenswrapper[4867]: I1201 09:10:13.438127 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.439419 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4rxmf"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.440407 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.440490 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.441279 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.442387 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.444003 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.444961 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bdwkr"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.446884 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.448064 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj"] Dec 01 09:10:13 crc 
kubenswrapper[4867]: I1201 09:10:13.449329 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kdm2m"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.449667 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.450690 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t266c"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.451918 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.453331 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.454526 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-959fr"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.455781 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.457052 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-t7zlw"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.458302 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.463736 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.464961 4867 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4rxmf"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.468466 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qq8rb"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.469734 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qq8rb" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.471586 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v6frd"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.473688 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qq8rb"] Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.509854 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.530552 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.550859 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.570328 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.589587 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.610606 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.630874 4867 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.650092 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.671055 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.690250 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.711151 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.731105 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.750516 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.786281 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-552vl\" (UniqueName: \"kubernetes.io/projected/e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a-kube-api-access-552vl\") pod \"machine-api-operator-5694c8668f-545ws\" (UID: \"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.791391 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.810896 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 
09:10:13.831326 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.851531 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.870485 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.897949 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.910967 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.930952 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.952095 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.970321 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 09:10:13 crc kubenswrapper[4867]: I1201 09:10:13.991339 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.011327 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.029803 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.040242 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.050648 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.090405 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.097750 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl9k6\" (UniqueName: \"kubernetes.io/projected/0b30e79f-de92-4b18-8b47-31cd45e753f1-kube-api-access-nl9k6\") pod \"route-controller-manager-6576b87f9c-vl9g7\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.111129 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.130306 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.156135 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.170860 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.191380 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.210798 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.250128 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.252491 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24dkw\" (UniqueName: \"kubernetes.io/projected/5e1caefa-d624-4351-89a3-d8c33a7924d6-kube-api-access-24dkw\") pod \"cluster-samples-operator-665b6dd947-7sqxq\" (UID: \"5e1caefa-d624-4351-89a3-d8c33a7924d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.271602 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.275635 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-545ws"] Dec 01 09:10:14 crc kubenswrapper[4867]: W1201 09:10:14.283779 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f5215e_ef8c_4b8f_8b3f_ecfb5deac62a.slice/crio-8f6a75e17a7afb9f6e257634322d4594c39e06056c8d96e06b5611bfe2f81662 WatchSource:0}: Error finding container 8f6a75e17a7afb9f6e257634322d4594c39e06056c8d96e06b5611bfe2f81662: Status 404 returned error can't find the container with id 8f6a75e17a7afb9f6e257634322d4594c39e06056c8d96e06b5611bfe2f81662 Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.305037 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-555js\" (UniqueName: 
\"kubernetes.io/projected/8b075132-2629-49ad-9361-42fe48ae5b57-kube-api-access-555js\") pod \"apiserver-76f77b778f-4dj8h\" (UID: \"8b075132-2629-49ad-9361-42fe48ae5b57\") " pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.325564 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68ns4\" (UniqueName: \"kubernetes.io/projected/bc693be6-558a-41e9-96cd-40061ff9ae5d-kube-api-access-68ns4\") pod \"controller-manager-879f6c89f-g8jfw\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.331169 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.345008 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg2qq\" (UniqueName: \"kubernetes.io/projected/61de6f28-6c9e-4175-a1e3-e29e8fed45f6-kube-api-access-dg2qq\") pod \"machine-approver-56656f9798-xd8vg\" (UID: \"61de6f28-6c9e-4175-a1e3-e29e8fed45f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.348993 4867 request.go:700] Waited for 1.018419277s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.350983 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.369737 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.371549 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.384876 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.390549 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.402560 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.410576 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.430086 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.452350 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.463406 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" event={"ID":"61de6f28-6c9e-4175-a1e3-e29e8fed45f6","Type":"ContainerStarted","Data":"5cfd4d851f6b32382862deaa32d21602912801ebee49d3b923fc410e77e0e591"} Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.464833 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" 
event={"ID":"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a","Type":"ContainerStarted","Data":"8f6a75e17a7afb9f6e257634322d4594c39e06056c8d96e06b5611bfe2f81662"} Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.485244 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7dvp\" (UniqueName: \"kubernetes.io/projected/8e6598f7-031a-4561-bde4-ae61121a17cd-kube-api-access-h7dvp\") pod \"apiserver-7bbb656c7d-s6z2w\" (UID: \"8e6598f7-031a-4561-bde4-ae61121a17cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.492395 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.550413 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.550633 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.551415 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.552799 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq"] Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.571268 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.590900 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.610582 4867 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.622125 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.630478 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.649941 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.658886 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7"] Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.673268 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.693418 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.711156 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.711194 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.731899 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.733191 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4dj8h"] Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.750014 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:10:14 crc kubenswrapper[4867]: W1201 09:10:14.755881 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b075132_2629_49ad_9361_42fe48ae5b57.slice/crio-3a38639fb6a149ba8425809d3457af7449b19b22409bc686f845edfc425ff25a WatchSource:0}: Error finding container 3a38639fb6a149ba8425809d3457af7449b19b22409bc686f845edfc425ff25a: Status 404 returned error can't find the container with id 3a38639fb6a149ba8425809d3457af7449b19b22409bc686f845edfc425ff25a Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.769486 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.810495 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.812398 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvb4\" (UniqueName: 
\"kubernetes.io/projected/56d53753-56f3-40fc-ba22-f635027ed42d-kube-api-access-4gvb4\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.812514 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g8jfw"] Dec 01 09:10:14 crc kubenswrapper[4867]: W1201 09:10:14.827330 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc693be6_558a_41e9_96cd_40061ff9ae5d.slice/crio-20aefd935014f46d0cfa81d7c2d4350472d82d5d470c1167a6da74bf5a2026f7 WatchSource:0}: Error finding container 20aefd935014f46d0cfa81d7c2d4350472d82d5d470c1167a6da74bf5a2026f7: Status 404 returned error can't find the container with id 20aefd935014f46d0cfa81d7c2d4350472d82d5d470c1167a6da74bf5a2026f7 Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.831057 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.850099 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.883665 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.890504 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.930695 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 09:10:14 crc 
kubenswrapper[4867]: I1201 09:10:14.935511 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/56d53753-56f3-40fc-ba22-f635027ed42d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7blkm\" (UID: \"56d53753-56f3-40fc-ba22-f635027ed42d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.950782 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.970919 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.971997 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w"] Dec 01 09:10:14 crc kubenswrapper[4867]: W1201 09:10:14.984484 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e6598f7_031a_4561_bde4_ae61121a17cd.slice/crio-595d579b969178006af9cae8ab872f10f1d609b8dc2c8a745423d72cd2cd39d3 WatchSource:0}: Error finding container 595d579b969178006af9cae8ab872f10f1d609b8dc2c8a745423d72cd2cd39d3: Status 404 returned error can't find the container with id 595d579b969178006af9cae8ab872f10f1d609b8dc2c8a745423d72cd2cd39d3 Dec 01 09:10:14 crc kubenswrapper[4867]: I1201 09:10:14.990418 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.011593 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.031031 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.051966 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.070804 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.090552 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.109878 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.130864 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.150324 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.170612 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.172136 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.191103 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.209696 4867 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.231243 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.250422 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.271438 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.353773 4867 request.go:700] Waited for 1.881085146s due to client-side throttling, not priority and fairness, request: PATCH:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-t7zlw/status Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.356277 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.356426 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.356519 4867 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.369404 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm"] Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455023 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95dlj\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-kube-api-access-95dlj\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455082 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1162eaab-3879-41de-8561-e544952c9b3c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ndvfn\" (UID: \"1162eaab-3879-41de-8561-e544952c9b3c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455104 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-config\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455148 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf9eddc0-c88c-4527-a1db-f472b220f253-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-h7kf2\" (UID: 
\"cf9eddc0-c88c-4527-a1db-f472b220f253\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455167 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-certificates\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455352 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455390 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e41b788-2056-4058-89de-1a8cf9885735-metrics-tls\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455416 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-trusted-ca\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455477 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455507 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wc5\" (UniqueName: \"kubernetes.io/projected/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-kube-api-access-n5wc5\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455542 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2207687f-bb58-4e3e-b0ca-bc1117a21d91-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-57cnj\" (UID: \"2207687f-bb58-4e3e-b0ca-bc1117a21d91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455562 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d00d9bfd-cd31-44f5-8b56-d14af3823d29-ca-trust-extracted\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455588 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/de3134ab-8adb-4427-b246-f89bed1610ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zq8v5\" (UID: \"de3134ab-8adb-4427-b246-f89bed1610ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455652 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-audit-policies\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455674 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9eddc0-c88c-4527-a1db-f472b220f253-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-h7kf2\" (UID: \"cf9eddc0-c88c-4527-a1db-f472b220f253\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455694 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-trusted-ca\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455716 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kmcj\" (UniqueName: \"kubernetes.io/projected/c699ddca-61b5-4f9a-ae7c-48653d9557f8-kube-api-access-9kmcj\") pod \"downloads-7954f5f757-zh8bh\" (UID: \"c699ddca-61b5-4f9a-ae7c-48653d9557f8\") " 
pod="openshift-console/downloads-7954f5f757-zh8bh" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455856 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455891 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49436ed5-4757-4aa2-92cb-63c65928893a-audit-dir\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455912 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8d96\" (UniqueName: \"kubernetes.io/projected/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-kube-api-access-k8d96\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455936 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455959 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.455985 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-bound-sa-token\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456045 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: E1201 09:10:15.456296 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:15.956282046 +0000 UTC m=+137.415668790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456338 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kr7z\" (UniqueName: \"kubernetes.io/projected/cf0d3295-6cbb-4b9c-bed7-d37c328ddea4-kube-api-access-2kr7z\") pod \"openshift-config-operator-7777fb866f-kc82j\" (UID: \"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456366 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d27d4386-9536-4d85-877c-3bea1ebae95c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lc54g\" (UID: \"d27d4386-9536-4d85-877c-3bea1ebae95c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456404 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-tls\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456422 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456438 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1162eaab-3879-41de-8561-e544952c9b3c-proxy-tls\") pod \"machine-config-controller-84d6567774-ndvfn\" (UID: \"1162eaab-3879-41de-8561-e544952c9b3c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456461 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456476 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456499 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456528 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cf0d3295-6cbb-4b9c-bed7-d37c328ddea4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kc82j\" (UID: \"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456563 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456610 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0d3295-6cbb-4b9c-bed7-d37c328ddea4-serving-cert\") pod \"openshift-config-operator-7777fb866f-kc82j\" (UID: \"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456636 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-serving-cert\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 
09:10:15.456658 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-serving-cert\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456706 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e41b788-2056-4058-89de-1a8cf9885735-bound-sa-token\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456759 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-config\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456796 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hj6l\" (UniqueName: \"kubernetes.io/projected/2207687f-bb58-4e3e-b0ca-bc1117a21d91-kube-api-access-7hj6l\") pod \"kube-storage-version-migrator-operator-b67b599dd-57cnj\" (UID: \"2207687f-bb58-4e3e-b0ca-bc1117a21d91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456890 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6e41b788-2056-4058-89de-1a8cf9885735-trusted-ca\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456919 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d00d9bfd-cd31-44f5-8b56-d14af3823d29-installation-pull-secrets\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456938 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3134ab-8adb-4427-b246-f89bed1610ed-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zq8v5\" (UID: \"de3134ab-8adb-4427-b246-f89bed1610ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.456961 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457019 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnv9l\" (UniqueName: \"kubernetes.io/projected/49436ed5-4757-4aa2-92cb-63c65928893a-kube-api-access-hnv9l\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457040 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpczg\" (UniqueName: \"kubernetes.io/projected/6e41b788-2056-4058-89de-1a8cf9885735-kube-api-access-jpczg\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457074 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78zsn\" (UniqueName: \"kubernetes.io/projected/1162eaab-3879-41de-8561-e544952c9b3c-kube-api-access-78zsn\") pod \"machine-config-controller-84d6567774-ndvfn\" (UID: \"1162eaab-3879-41de-8561-e544952c9b3c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457105 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs5m7\" (UniqueName: \"kubernetes.io/projected/05fcf7ce-2e60-4fc5-b60b-be4b8b5d4963-kube-api-access-hs5m7\") pod \"migrator-59844c95c7-vsfps\" (UID: \"05fcf7ce-2e60-4fc5-b60b-be4b8b5d4963\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457123 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457140 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2207687f-bb58-4e3e-b0ca-bc1117a21d91-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-57cnj\" (UID: \"2207687f-bb58-4e3e-b0ca-bc1117a21d91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457164 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bmvq\" (UniqueName: \"kubernetes.io/projected/d27d4386-9536-4d85-877c-3bea1ebae95c-kube-api-access-7bmvq\") pod \"package-server-manager-789f6589d5-lc54g\" (UID: \"d27d4386-9536-4d85-877c-3bea1ebae95c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457178 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9eddc0-c88c-4527-a1db-f472b220f253-config\") pod \"kube-apiserver-operator-766d6c64bb-h7kf2\" (UID: \"cf9eddc0-c88c-4527-a1db-f472b220f253\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457195 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de3134ab-8adb-4427-b246-f89bed1610ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zq8v5\" (UID: \"de3134ab-8adb-4427-b246-f89bed1610ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.457235 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-service-ca-bundle\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.489089 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" event={"ID":"0b30e79f-de92-4b18-8b47-31cd45e753f1","Type":"ContainerStarted","Data":"1665cf0223812cc9388614badc6496840c24c829fb8af49496e17c7ff8f86ee5"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.489136 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" event={"ID":"0b30e79f-de92-4b18-8b47-31cd45e753f1","Type":"ContainerStarted","Data":"f35a74beeca72b2818af8e39226899c048dd1428ac7aba6d3aed41d4552cc761"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.489546 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.491438 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" event={"ID":"8e6598f7-031a-4561-bde4-ae61121a17cd","Type":"ContainerStarted","Data":"595d579b969178006af9cae8ab872f10f1d609b8dc2c8a745423d72cd2cd39d3"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.500505 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" event={"ID":"5e1caefa-d624-4351-89a3-d8c33a7924d6","Type":"ContainerStarted","Data":"a3e22df0c8b89d25d94dede4ea8398d5ab1b9798aa6dec0cd1fc361d78415f89"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.500558 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" event={"ID":"5e1caefa-d624-4351-89a3-d8c33a7924d6","Type":"ContainerStarted","Data":"f2bac75276205f6bac45fbec74e162e34a704e2d9aad99564d833d028faf6d8b"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.500571 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" event={"ID":"56d53753-56f3-40fc-ba22-f635027ed42d","Type":"ContainerStarted","Data":"59750fd863c5b7e7ef6473ae513daff54c5600981f8636e50191e437e0287a95"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.502145 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" event={"ID":"61de6f28-6c9e-4175-a1e3-e29e8fed45f6","Type":"ContainerStarted","Data":"acfd4b6b2df9d88e0cba2806b0d59531de11bdfb8f8826288b18d4473987af15"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.504238 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" event={"ID":"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a","Type":"ContainerStarted","Data":"effbdeeeba899662bb342c503a3d5c4698bfe2b6273360550fa8fe035d0ddea3"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.504263 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" event={"ID":"e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a","Type":"ContainerStarted","Data":"277b1005ffe8f2951d738498ab12edd637d8823f64f34e6670c9922d443e5ebf"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.505606 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" event={"ID":"bc693be6-558a-41e9-96cd-40061ff9ae5d","Type":"ContainerStarted","Data":"20aefd935014f46d0cfa81d7c2d4350472d82d5d470c1167a6da74bf5a2026f7"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 
09:10:15.519136 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" event={"ID":"8b075132-2629-49ad-9361-42fe48ae5b57","Type":"ContainerStarted","Data":"3a38639fb6a149ba8425809d3457af7449b19b22409bc686f845edfc425ff25a"} Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.567680 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.567864 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d00d9bfd-cd31-44f5-8b56-d14af3823d29-installation-pull-secrets\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.567894 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3134ab-8adb-4427-b246-f89bed1610ed-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zq8v5\" (UID: \"de3134ab-8adb-4427-b246-f89bed1610ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.567918 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e41b788-2056-4058-89de-1a8cf9885735-trusted-ca\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 
09:10:15.567968 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-config\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568006 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftvbh\" (UniqueName: \"kubernetes.io/projected/c9d5feec-4daa-4cea-a996-0a179e42de9f-kube-api-access-ftvbh\") pod \"openshift-apiserver-operator-796bbdcf4f-r4hs5\" (UID: \"c9d5feec-4daa-4cea-a996-0a179e42de9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568055 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62b4517-3937-44e9-8062-982896975a9d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5zdqv\" (UID: \"a62b4517-3937-44e9-8062-982896975a9d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568076 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d-node-bootstrap-token\") pod \"machine-config-server-26zfv\" (UID: \"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d\") " pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568096 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-serving-cert\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568121 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4xws\" (UniqueName: \"kubernetes.io/projected/ec06a7ff-9325-4de9-b47e-d8315761bf8d-kube-api-access-q4xws\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568157 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnv9l\" (UniqueName: \"kubernetes.io/projected/49436ed5-4757-4aa2-92cb-63c65928893a-kube-api-access-hnv9l\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568180 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpczg\" (UniqueName: \"kubernetes.io/projected/6e41b788-2056-4058-89de-1a8cf9885735-kube-api-access-jpczg\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568201 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e5dcb3-d55f-40cf-a89f-3367e84322d1-config-volume\") pod \"collect-profiles-29409660-lcc6h\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568236 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5zjj\" (UniqueName: \"kubernetes.io/projected/2e1b0e60-347c-458c-853a-301c03aeb597-kube-api-access-w5zjj\") pod \"service-ca-operator-777779d784-wj7xq\" (UID: \"2e1b0e60-347c-458c-853a-301c03aeb597\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568270 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5m7\" (UniqueName: \"kubernetes.io/projected/05fcf7ce-2e60-4fc5-b60b-be4b8b5d4963-kube-api-access-hs5m7\") pod \"migrator-59844c95c7-vsfps\" (UID: \"05fcf7ce-2e60-4fc5-b60b-be4b8b5d4963\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568295 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568317 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2207687f-bb58-4e3e-b0ca-bc1117a21d91-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-57cnj\" (UID: \"2207687f-bb58-4e3e-b0ca-bc1117a21d91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568341 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2e1b0e60-347c-458c-853a-301c03aeb597-config\") pod \"service-ca-operator-777779d784-wj7xq\" (UID: \"2e1b0e60-347c-458c-853a-301c03aeb597\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568366 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/997ee678-8d54-4f91-af1d-4eefd5006f85-images\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568386 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-stats-auth\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568410 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db10378-99f0-49e1-9148-5a0441652430-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c6qt4\" (UID: \"0db10378-99f0-49e1-9148-5a0441652430\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568432 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-csi-data-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 
crc kubenswrapper[4867]: I1201 09:10:15.568456 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-service-ca-bundle\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568479 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wdb\" (UniqueName: \"kubernetes.io/projected/cc49fbcc-d2e4-47a3-a1ea-7c414726b20d-kube-api-access-r7wdb\") pod \"catalog-operator-68c6474976-c9spn\" (UID: \"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568501 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db10378-99f0-49e1-9148-5a0441652430-config\") pod \"kube-controller-manager-operator-78b949d7b-c6qt4\" (UID: \"0db10378-99f0-49e1-9148-5a0441652430\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568520 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-oauth-serving-cert\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568545 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-certificates\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568575 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e950295-ab80-42c7-b1f2-f5869448b824-etcd-client\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568597 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568619 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e41b788-2056-4058-89de-1a8cf9885735-metrics-tls\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568654 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f7f1347-239a-4608-9972-2a50b5216725-cert\") pod \"ingress-canary-qq8rb\" (UID: \"7f7f1347-239a-4608-9972-2a50b5216725\") " pod="openshift-ingress-canary/ingress-canary-qq8rb" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568676 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/222fb4b4-7d0d-4305-9ba9-9f686dc10dd6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-szthm\" (UID: \"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568716 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-trusted-ca-bundle\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568762 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d00d9bfd-cd31-44f5-8b56-d14af3823d29-ca-trust-extracted\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568798 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-mountpoint-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568855 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-oauth-config\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " 
pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568877 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e950295-ab80-42c7-b1f2-f5869448b824-etcd-service-ca\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568903 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9eddc0-c88c-4527-a1db-f472b220f253-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-h7kf2\" (UID: \"cf9eddc0-c88c-4527-a1db-f472b220f253\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568926 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfx2j\" (UniqueName: \"kubernetes.io/projected/6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44-kube-api-access-rfx2j\") pod \"dns-default-v6frd\" (UID: \"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44\") " pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568948 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e31393e0-d2c9-4310-83a3-5958278ea7a2-tmpfs\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.568970 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24xwc\" (UniqueName: 
\"kubernetes.io/projected/2e876542-f5c3-495f-bb1c-72c1136686ac-kube-api-access-24xwc\") pod \"dns-operator-744455d44c-kbx9t\" (UID: \"2e876542-f5c3-495f-bb1c-72c1136686ac\") " pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569004 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-trusted-ca\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569028 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kmcj\" (UniqueName: \"kubernetes.io/projected/c699ddca-61b5-4f9a-ae7c-48653d9557f8-kube-api-access-9kmcj\") pod \"downloads-7954f5f757-zh8bh\" (UID: \"c699ddca-61b5-4f9a-ae7c-48653d9557f8\") " pod="openshift-console/downloads-7954f5f757-zh8bh" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569051 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/997ee678-8d54-4f91-af1d-4eefd5006f85-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569072 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c48142f7-e9a6-49ee-b638-89e198befd03-signing-cabundle\") pod \"service-ca-9c57cc56f-9n2mx\" (UID: \"c48142f7-e9a6-49ee-b638-89e198befd03\") " pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569094 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mnf\" (UniqueName: \"kubernetes.io/projected/45e5dcb3-d55f-40cf-a89f-3367e84322d1-kube-api-access-k9mnf\") pod \"collect-profiles-29409660-lcc6h\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569116 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff715e51-b68c-408d-87d9-7360399c9d9d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bdwkr\" (UID: \"ff715e51-b68c-408d-87d9-7360399c9d9d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569140 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569164 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6w6sm\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569186 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d5feec-4daa-4cea-a996-0a179e42de9f-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-r4hs5\" (UID: \"c9d5feec-4daa-4cea-a996-0a179e42de9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569217 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44-config-volume\") pod \"dns-default-v6frd\" (UID: \"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44\") " pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569239 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d-certs\") pod \"machine-config-server-26zfv\" (UID: \"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d\") " pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569261 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq7ff\" (UniqueName: \"kubernetes.io/projected/997ee678-8d54-4f91-af1d-4eefd5006f85-kube-api-access-bq7ff\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569282 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0db10378-99f0-49e1-9148-5a0441652430-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c6qt4\" (UID: \"0db10378-99f0-49e1-9148-5a0441652430\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569308 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-tls\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569331 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569354 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-registration-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569378 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-serving-cert\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569399 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-serving-cert\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569434 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cf0d3295-6cbb-4b9c-bed7-d37c328ddea4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kc82j\" (UID: \"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569456 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc49fbcc-d2e4-47a3-a1ea-7c414726b20d-profile-collector-cert\") pod \"catalog-operator-68c6474976-c9spn\" (UID: \"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569485 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-config\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569508 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hj6l\" (UniqueName: \"kubernetes.io/projected/2207687f-bb58-4e3e-b0ca-bc1117a21d91-kube-api-access-7hj6l\") pod \"kube-storage-version-migrator-operator-b67b599dd-57cnj\" (UID: \"2207687f-bb58-4e3e-b0ca-bc1117a21d91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569529 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-socket-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569553 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569577 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlbxt\" (UniqueName: \"kubernetes.io/projected/3e950295-ab80-42c7-b1f2-f5869448b824-kube-api-access-nlbxt\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569602 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnn6\" (UniqueName: \"kubernetes.io/projected/678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5-kube-api-access-wtnn6\") pod \"control-plane-machine-set-operator-78cbb6b69f-mph7x\" (UID: \"678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569623 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-metrics-certs\") pod \"router-default-5444994796-l44jd\" (UID: 
\"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569649 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78zsn\" (UniqueName: \"kubernetes.io/projected/1162eaab-3879-41de-8561-e544952c9b3c-kube-api-access-78zsn\") pod \"machine-config-controller-84d6567774-ndvfn\" (UID: \"1162eaab-3879-41de-8561-e544952c9b3c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569676 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgbrs\" (UniqueName: \"kubernetes.io/projected/a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d-kube-api-access-rgbrs\") pod \"machine-config-server-26zfv\" (UID: \"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d\") " pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569699 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-default-certificate\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569725 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bmvq\" (UniqueName: \"kubernetes.io/projected/d27d4386-9536-4d85-877c-3bea1ebae95c-kube-api-access-7bmvq\") pod \"package-server-manager-789f6589d5-lc54g\" (UID: \"d27d4386-9536-4d85-877c-3bea1ebae95c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569749 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9eddc0-c88c-4527-a1db-f472b220f253-config\") pod \"kube-apiserver-operator-766d6c64bb-h7kf2\" (UID: \"cf9eddc0-c88c-4527-a1db-f472b220f253\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569771 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de3134ab-8adb-4427-b246-f89bed1610ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zq8v5\" (UID: \"de3134ab-8adb-4427-b246-f89bed1610ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569797 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6w6sm\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569837 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1b0e60-347c-458c-853a-301c03aeb597-serving-cert\") pod \"service-ca-operator-777779d784-wj7xq\" (UID: \"2e1b0e60-347c-458c-853a-301c03aeb597\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569869 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45e5dcb3-d55f-40cf-a89f-3367e84322d1-secret-volume\") pod \"collect-profiles-29409660-lcc6h\" (UID: 
\"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569897 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95dlj\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-kube-api-access-95dlj\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569936 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1162eaab-3879-41de-8561-e544952c9b3c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ndvfn\" (UID: \"1162eaab-3879-41de-8561-e544952c9b3c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.569964 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3134ab-8adb-4427-b246-f89bed1610ed-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zq8v5\" (UID: \"de3134ab-8adb-4427-b246-f89bed1610ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570006 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-config\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570035 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/cf9eddc0-c88c-4527-a1db-f472b220f253-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-h7kf2\" (UID: \"cf9eddc0-c88c-4527-a1db-f472b220f253\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570058 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-service-ca\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570079 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3e950295-ab80-42c7-b1f2-f5869448b824-etcd-ca\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570104 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/222fb4b4-7d0d-4305-9ba9-9f686dc10dd6-srv-cert\") pod \"olm-operator-6b444d44fb-szthm\" (UID: \"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570126 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e950295-ab80-42c7-b1f2-f5869448b824-config\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570163 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-trusted-ca\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570190 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq7t2\" (UniqueName: \"kubernetes.io/projected/7f7f1347-239a-4608-9972-2a50b5216725-kube-api-access-fq7t2\") pod \"ingress-canary-qq8rb\" (UID: \"7f7f1347-239a-4608-9972-2a50b5216725\") " pod="openshift-ingress-canary/ingress-canary-qq8rb" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570213 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2207687f-bb58-4e3e-b0ca-bc1117a21d91-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-57cnj\" (UID: \"2207687f-bb58-4e3e-b0ca-bc1117a21d91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570238 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570261 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5wc5\" (UniqueName: \"kubernetes.io/projected/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-kube-api-access-n5wc5\") pod \"console-operator-58897d9998-bnd8f\" 
(UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570297 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de3134ab-8adb-4427-b246-f89bed1610ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zq8v5\" (UID: \"de3134ab-8adb-4427-b246-f89bed1610ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570322 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/997ee678-8d54-4f91-af1d-4eefd5006f85-proxy-tls\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570345 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnvlc\" (UniqueName: \"kubernetes.io/projected/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-kube-api-access-fnvlc\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570367 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62b4517-3937-44e9-8062-982896975a9d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5zdqv\" (UID: \"a62b4517-3937-44e9-8062-982896975a9d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570397 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-audit-policies\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570428 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e31393e0-d2c9-4310-83a3-5958278ea7a2-webhook-cert\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570461 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49436ed5-4757-4aa2-92cb-63c65928893a-audit-dir\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570486 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8d96\" (UniqueName: \"kubernetes.io/projected/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-kube-api-access-k8d96\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570517 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570549 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z7p5\" (UniqueName: \"kubernetes.io/projected/c7d3f2ef-022b-41f5-84e5-6be42f48b023-kube-api-access-7z7p5\") pod \"marketplace-operator-79b997595-6w6sm\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570580 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chpmd\" (UniqueName: \"kubernetes.io/projected/e31393e0-d2c9-4310-83a3-5958278ea7a2-kube-api-access-chpmd\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570615 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-bound-sa-token\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570648 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570682 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44-metrics-tls\") pod \"dns-default-v6frd\" (UID: \"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44\") " pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570714 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht24r\" (UniqueName: \"kubernetes.io/projected/222fb4b4-7d0d-4305-9ba9-9f686dc10dd6-kube-api-access-ht24r\") pod \"olm-operator-6b444d44fb-szthm\" (UID: \"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570749 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d5feec-4daa-4cea-a996-0a179e42de9f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r4hs5\" (UID: \"c9d5feec-4daa-4cea-a996-0a179e42de9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570784 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kr7z\" (UniqueName: \"kubernetes.io/projected/cf0d3295-6cbb-4b9c-bed7-d37c328ddea4-kube-api-access-2kr7z\") pod \"openshift-config-operator-7777fb866f-kc82j\" (UID: \"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570848 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d27d4386-9536-4d85-877c-3bea1ebae95c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lc54g\" (UID: \"d27d4386-9536-4d85-877c-3bea1ebae95c\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570886 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n24vl\" (UniqueName: \"kubernetes.io/projected/a62b4517-3937-44e9-8062-982896975a9d-kube-api-access-n24vl\") pod \"openshift-controller-manager-operator-756b6f6bc6-5zdqv\" (UID: \"a62b4517-3937-44e9-8062-982896975a9d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570923 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570954 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh87n\" (UniqueName: \"kubernetes.io/projected/c48142f7-e9a6-49ee-b638-89e198befd03-kube-api-access-fh87n\") pod \"service-ca-9c57cc56f-9n2mx\" (UID: \"c48142f7-e9a6-49ee-b638-89e198befd03\") " pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.570985 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-service-ca-bundle\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571020 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571053 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1162eaab-3879-41de-8561-e544952c9b3c-proxy-tls\") pod \"machine-config-controller-84d6567774-ndvfn\" (UID: \"1162eaab-3879-41de-8561-e544952c9b3c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571086 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc49fbcc-d2e4-47a3-a1ea-7c414726b20d-srv-cert\") pod \"catalog-operator-68c6474976-c9spn\" (UID: \"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571116 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e876542-f5c3-495f-bb1c-72c1136686ac-metrics-tls\") pod \"dns-operator-744455d44c-kbx9t\" (UID: \"2e876542-f5c3-495f-bb1c-72c1136686ac\") " pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571147 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75dqj\" (UniqueName: \"kubernetes.io/projected/ff715e51-b68c-408d-87d9-7360399c9d9d-kube-api-access-75dqj\") pod \"multus-admission-controller-857f4d67dd-bdwkr\" (UID: 
\"ff715e51-b68c-408d-87d9-7360399c9d9d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571172 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571194 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-plugins-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571220 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0d3295-6cbb-4b9c-bed7-d37c328ddea4-serving-cert\") pod \"openshift-config-operator-7777fb866f-kc82j\" (UID: \"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571248 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571303 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e31393e0-d2c9-4310-83a3-5958278ea7a2-apiservice-cert\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571327 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mph7x\" (UID: \"678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571352 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbtmp\" (UniqueName: \"kubernetes.io/projected/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-kube-api-access-nbtmp\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571374 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e950295-ab80-42c7-b1f2-f5869448b824-serving-cert\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571413 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e41b788-2056-4058-89de-1a8cf9885735-bound-sa-token\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.571438 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c48142f7-e9a6-49ee-b638-89e198befd03-signing-key\") pod \"service-ca-9c57cc56f-9n2mx\" (UID: \"c48142f7-e9a6-49ee-b638-89e198befd03\") " pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.572295 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d00d9bfd-cd31-44f5-8b56-d14af3823d29-ca-trust-extracted\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.572567 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-service-ca-bundle\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.572827 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1162eaab-3879-41de-8561-e544952c9b3c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ndvfn\" (UID: \"1162eaab-3879-41de-8561-e544952c9b3c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.572868 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-config\") 
pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.573049 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2207687f-bb58-4e3e-b0ca-bc1117a21d91-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-57cnj\" (UID: \"2207687f-bb58-4e3e-b0ca-bc1117a21d91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.573880 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-certificates\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.574672 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2207687f-bb58-4e3e-b0ca-bc1117a21d91-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-57cnj\" (UID: \"2207687f-bb58-4e3e-b0ca-bc1117a21d91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.575187 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cf0d3295-6cbb-4b9c-bed7-d37c328ddea4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kc82j\" (UID: \"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.576525 
4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.576604 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49436ed5-4757-4aa2-92cb-63c65928893a-audit-dir\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.579934 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-serving-cert\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: E1201 09:10:15.582094 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:16.082054303 +0000 UTC m=+137.541441097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.582391 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-serving-cert\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.584489 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.585671 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.586431 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9eddc0-c88c-4527-a1db-f472b220f253-config\") pod 
\"kube-apiserver-operator-766d6c64bb-h7kf2\" (UID: \"cf9eddc0-c88c-4527-a1db-f472b220f253\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.586967 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.586969 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-trusted-ca\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.587598 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-config\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.588182 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1162eaab-3879-41de-8561-e544952c9b3c-proxy-tls\") pod \"machine-config-controller-84d6567774-ndvfn\" (UID: \"1162eaab-3879-41de-8561-e544952c9b3c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.588303 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.588471 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e41b788-2056-4058-89de-1a8cf9885735-metrics-tls\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.588953 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.589060 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-tls\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.589472 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc 
kubenswrapper[4867]: I1201 09:10:15.589471 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-trusted-ca\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.589646 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.590939 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de3134ab-8adb-4427-b246-f89bed1610ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zq8v5\" (UID: \"de3134ab-8adb-4427-b246-f89bed1610ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.591120 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.598403 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0d3295-6cbb-4b9c-bed7-d37c328ddea4-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-kc82j\" (UID: \"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.598420 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9eddc0-c88c-4527-a1db-f472b220f253-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-h7kf2\" (UID: \"cf9eddc0-c88c-4527-a1db-f472b220f253\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.600059 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.600174 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.605693 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d00d9bfd-cd31-44f5-8b56-d14af3823d29-installation-pull-secrets\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.608206 4867 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-hs5m7\" (UniqueName: \"kubernetes.io/projected/05fcf7ce-2e60-4fc5-b60b-be4b8b5d4963-kube-api-access-hs5m7\") pod \"migrator-59844c95c7-vsfps\" (UID: \"05fcf7ce-2e60-4fc5-b60b-be4b8b5d4963\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.628228 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kmcj\" (UniqueName: \"kubernetes.io/projected/c699ddca-61b5-4f9a-ae7c-48653d9557f8-kube-api-access-9kmcj\") pod \"downloads-7954f5f757-zh8bh\" (UID: \"c699ddca-61b5-4f9a-ae7c-48653d9557f8\") " pod="openshift-console/downloads-7954f5f757-zh8bh" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.637226 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672430 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgbrs\" (UniqueName: \"kubernetes.io/projected/a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d-kube-api-access-rgbrs\") pod \"machine-config-server-26zfv\" (UID: \"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d\") " pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672473 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-default-certificate\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672508 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6w6sm\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672527 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1b0e60-347c-458c-853a-301c03aeb597-serving-cert\") pod \"service-ca-operator-777779d784-wj7xq\" (UID: \"2e1b0e60-347c-458c-853a-301c03aeb597\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672561 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45e5dcb3-d55f-40cf-a89f-3367e84322d1-secret-volume\") pod \"collect-profiles-29409660-lcc6h\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672577 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-service-ca\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672591 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3e950295-ab80-42c7-b1f2-f5869448b824-etcd-ca\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672608 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/222fb4b4-7d0d-4305-9ba9-9f686dc10dd6-srv-cert\") pod \"olm-operator-6b444d44fb-szthm\" (UID: \"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672623 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e950295-ab80-42c7-b1f2-f5869448b824-config\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672646 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq7t2\" (UniqueName: \"kubernetes.io/projected/7f7f1347-239a-4608-9972-2a50b5216725-kube-api-access-fq7t2\") pod \"ingress-canary-qq8rb\" (UID: \"7f7f1347-239a-4608-9972-2a50b5216725\") " pod="openshift-ingress-canary/ingress-canary-qq8rb" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672671 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/997ee678-8d54-4f91-af1d-4eefd5006f85-proxy-tls\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672686 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnvlc\" (UniqueName: \"kubernetes.io/projected/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-kube-api-access-fnvlc\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672701 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62b4517-3937-44e9-8062-982896975a9d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5zdqv\" (UID: \"a62b4517-3937-44e9-8062-982896975a9d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672726 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e31393e0-d2c9-4310-83a3-5958278ea7a2-webhook-cert\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672749 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z7p5\" (UniqueName: \"kubernetes.io/projected/c7d3f2ef-022b-41f5-84e5-6be42f48b023-kube-api-access-7z7p5\") pod \"marketplace-operator-79b997595-6w6sm\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672770 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44-metrics-tls\") pod \"dns-default-v6frd\" (UID: \"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44\") " pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672788 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chpmd\" (UniqueName: \"kubernetes.io/projected/e31393e0-d2c9-4310-83a3-5958278ea7a2-kube-api-access-chpmd\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672830 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n24vl\" (UniqueName: \"kubernetes.io/projected/a62b4517-3937-44e9-8062-982896975a9d-kube-api-access-n24vl\") pod \"openshift-controller-manager-operator-756b6f6bc6-5zdqv\" (UID: \"a62b4517-3937-44e9-8062-982896975a9d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672846 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht24r\" (UniqueName: \"kubernetes.io/projected/222fb4b4-7d0d-4305-9ba9-9f686dc10dd6-kube-api-access-ht24r\") pod \"olm-operator-6b444d44fb-szthm\" (UID: \"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672862 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d5feec-4daa-4cea-a996-0a179e42de9f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r4hs5\" (UID: \"c9d5feec-4daa-4cea-a996-0a179e42de9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672887 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh87n\" (UniqueName: \"kubernetes.io/projected/c48142f7-e9a6-49ee-b638-89e198befd03-kube-api-access-fh87n\") pod \"service-ca-9c57cc56f-9n2mx\" (UID: \"c48142f7-e9a6-49ee-b638-89e198befd03\") " pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672902 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-service-ca-bundle\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672918 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc49fbcc-d2e4-47a3-a1ea-7c414726b20d-srv-cert\") pod \"catalog-operator-68c6474976-c9spn\" (UID: \"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672932 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e876542-f5c3-495f-bb1c-72c1136686ac-metrics-tls\") pod \"dns-operator-744455d44c-kbx9t\" (UID: \"2e876542-f5c3-495f-bb1c-72c1136686ac\") " pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672948 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75dqj\" (UniqueName: \"kubernetes.io/projected/ff715e51-b68c-408d-87d9-7360399c9d9d-kube-api-access-75dqj\") pod \"multus-admission-controller-857f4d67dd-bdwkr\" (UID: \"ff715e51-b68c-408d-87d9-7360399c9d9d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672967 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-plugins-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.672984 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e31393e0-d2c9-4310-83a3-5958278ea7a2-apiservice-cert\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673000 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mph7x\" (UID: \"678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673020 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbtmp\" (UniqueName: \"kubernetes.io/projected/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-kube-api-access-nbtmp\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673035 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e950295-ab80-42c7-b1f2-f5869448b824-serving-cert\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673054 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c48142f7-e9a6-49ee-b638-89e198befd03-signing-key\") pod \"service-ca-9c57cc56f-9n2mx\" (UID: \"c48142f7-e9a6-49ee-b638-89e198befd03\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673075 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-config\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673091 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftvbh\" (UniqueName: \"kubernetes.io/projected/c9d5feec-4daa-4cea-a996-0a179e42de9f-kube-api-access-ftvbh\") pod \"openshift-apiserver-operator-796bbdcf4f-r4hs5\" (UID: \"c9d5feec-4daa-4cea-a996-0a179e42de9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673106 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62b4517-3937-44e9-8062-982896975a9d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5zdqv\" (UID: \"a62b4517-3937-44e9-8062-982896975a9d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673122 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d-node-bootstrap-token\") pod \"machine-config-server-26zfv\" (UID: \"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d\") " pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673136 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-serving-cert\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673150 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4xws\" (UniqueName: \"kubernetes.io/projected/ec06a7ff-9325-4de9-b47e-d8315761bf8d-kube-api-access-q4xws\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673181 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e5dcb3-d55f-40cf-a89f-3367e84322d1-config-volume\") pod \"collect-profiles-29409660-lcc6h\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673197 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5zjj\" (UniqueName: \"kubernetes.io/projected/2e1b0e60-347c-458c-853a-301c03aeb597-kube-api-access-w5zjj\") pod \"service-ca-operator-777779d784-wj7xq\" (UID: \"2e1b0e60-347c-458c-853a-301c03aeb597\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673213 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1b0e60-347c-458c-853a-301c03aeb597-config\") pod \"service-ca-operator-777779d784-wj7xq\" (UID: \"2e1b0e60-347c-458c-853a-301c03aeb597\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673228 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/997ee678-8d54-4f91-af1d-4eefd5006f85-images\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673241 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-stats-auth\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673273 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db10378-99f0-49e1-9148-5a0441652430-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c6qt4\" (UID: \"0db10378-99f0-49e1-9148-5a0441652430\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673287 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-csi-data-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673307 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wdb\" (UniqueName: \"kubernetes.io/projected/cc49fbcc-d2e4-47a3-a1ea-7c414726b20d-kube-api-access-r7wdb\") pod \"catalog-operator-68c6474976-c9spn\" (UID: \"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673323 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0db10378-99f0-49e1-9148-5a0441652430-config\") pod \"kube-controller-manager-operator-78b949d7b-c6qt4\" (UID: \"0db10378-99f0-49e1-9148-5a0441652430\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673337 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-oauth-serving-cert\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673359 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e950295-ab80-42c7-b1f2-f5869448b824-etcd-client\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673376 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f7f1347-239a-4608-9972-2a50b5216725-cert\") pod \"ingress-canary-qq8rb\" (UID: \"7f7f1347-239a-4608-9972-2a50b5216725\") " pod="openshift-ingress-canary/ingress-canary-qq8rb" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/222fb4b4-7d0d-4305-9ba9-9f686dc10dd6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-szthm\" (UID: 
\"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673403 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-trusted-ca-bundle\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673433 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-mountpoint-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673448 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-oauth-config\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673464 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e950295-ab80-42c7-b1f2-f5869448b824-etcd-service-ca\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673480 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfx2j\" (UniqueName: \"kubernetes.io/projected/6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44-kube-api-access-rfx2j\") pod 
\"dns-default-v6frd\" (UID: \"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44\") " pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673496 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673513 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/997ee678-8d54-4f91-af1d-4eefd5006f85-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673528 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e31393e0-d2c9-4310-83a3-5958278ea7a2-tmpfs\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673542 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24xwc\" (UniqueName: \"kubernetes.io/projected/2e876542-f5c3-495f-bb1c-72c1136686ac-kube-api-access-24xwc\") pod \"dns-operator-744455d44c-kbx9t\" (UID: \"2e876542-f5c3-495f-bb1c-72c1136686ac\") " pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673557 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/c48142f7-e9a6-49ee-b638-89e198befd03-signing-cabundle\") pod \"service-ca-9c57cc56f-9n2mx\" (UID: \"c48142f7-e9a6-49ee-b638-89e198befd03\") " pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673571 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mnf\" (UniqueName: \"kubernetes.io/projected/45e5dcb3-d55f-40cf-a89f-3367e84322d1-kube-api-access-k9mnf\") pod \"collect-profiles-29409660-lcc6h\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673587 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff715e51-b68c-408d-87d9-7360399c9d9d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bdwkr\" (UID: \"ff715e51-b68c-408d-87d9-7360399c9d9d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673607 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6w6sm\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673639 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44-config-volume\") pod \"dns-default-v6frd\" (UID: \"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44\") " pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673658 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d-certs\") pod \"machine-config-server-26zfv\" (UID: \"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d\") " pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673674 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq7ff\" (UniqueName: \"kubernetes.io/projected/997ee678-8d54-4f91-af1d-4eefd5006f85-kube-api-access-bq7ff\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673688 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d5feec-4daa-4cea-a996-0a179e42de9f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r4hs5\" (UID: \"c9d5feec-4daa-4cea-a996-0a179e42de9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673704 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0db10378-99f0-49e1-9148-5a0441652430-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c6qt4\" (UID: \"0db10378-99f0-49e1-9148-5a0441652430\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673721 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-registration-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " 
pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673739 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc49fbcc-d2e4-47a3-a1ea-7c414726b20d-profile-collector-cert\") pod \"catalog-operator-68c6474976-c9spn\" (UID: \"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673761 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-socket-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673779 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnn6\" (UniqueName: \"kubernetes.io/projected/678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5-kube-api-access-wtnn6\") pod \"control-plane-machine-set-operator-78cbb6b69f-mph7x\" (UID: \"678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673795 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlbxt\" (UniqueName: \"kubernetes.io/projected/3e950295-ab80-42c7-b1f2-f5869448b824-kube-api-access-nlbxt\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.673826 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-metrics-certs\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.676732 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e5dcb3-d55f-40cf-a89f-3367e84322d1-config-volume\") pod \"collect-profiles-29409660-lcc6h\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.677126 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/997ee678-8d54-4f91-af1d-4eefd5006f85-images\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.677223 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-metrics-certs\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.678397 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c48142f7-e9a6-49ee-b638-89e198befd03-signing-cabundle\") pod \"service-ca-9c57cc56f-9n2mx\" (UID: \"c48142f7-e9a6-49ee-b638-89e198befd03\") " pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.678586 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-plugins-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.679095 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d5feec-4daa-4cea-a996-0a179e42de9f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r4hs5\" (UID: \"c9d5feec-4daa-4cea-a996-0a179e42de9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.679430 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-serving-cert\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.679889 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-service-ca-bundle\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.680539 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3e950295-ab80-42c7-b1f2-f5869448b824-etcd-ca\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.680793 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-default-certificate\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.682003 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e950295-ab80-42c7-b1f2-f5869448b824-config\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.682402 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6w6sm\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.682874 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-csi-data-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.683293 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e876542-f5c3-495f-bb1c-72c1136686ac-metrics-tls\") pod \"dns-operator-744455d44c-kbx9t\" (UID: \"2e876542-f5c3-495f-bb1c-72c1136686ac\") " pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.683483 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0db10378-99f0-49e1-9148-5a0441652430-config\") pod \"kube-controller-manager-operator-78b949d7b-c6qt4\" (UID: \"0db10378-99f0-49e1-9148-5a0441652430\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.684159 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-oauth-serving-cert\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.687695 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44-config-volume\") pod \"dns-default-v6frd\" (UID: \"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44\") " pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.687788 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-registration-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.688025 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6w6sm\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.688335 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-socket-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.688445 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-mountpoint-dir\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.688554 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62b4517-3937-44e9-8062-982896975a9d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5zdqv\" (UID: \"a62b4517-3937-44e9-8062-982896975a9d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.689173 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e950295-ab80-42c7-b1f2-f5869448b824-etcd-service-ca\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: E1201 09:10:15.689886 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:16.189868604 +0000 UTC m=+137.649255458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.702357 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e950295-ab80-42c7-b1f2-f5869448b824-serving-cert\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.703017 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44-metrics-tls\") pod \"dns-default-v6frd\" (UID: \"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44\") " pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.702992 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/997ee678-8d54-4f91-af1d-4eefd5006f85-proxy-tls\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.703158 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d5feec-4daa-4cea-a996-0a179e42de9f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r4hs5\" (UID: \"c9d5feec-4daa-4cea-a996-0a179e42de9f\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.702499 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62b4517-3937-44e9-8062-982896975a9d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5zdqv\" (UID: \"a62b4517-3937-44e9-8062-982896975a9d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.703363 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e950295-ab80-42c7-b1f2-f5869448b824-etcd-client\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.704270 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mph7x\" (UID: \"678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.727654 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-service-ca\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.728121 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/997ee678-8d54-4f91-af1d-4eefd5006f85-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.728206 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.728484 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1b0e60-347c-458c-853a-301c03aeb597-config\") pod \"service-ca-operator-777779d784-wj7xq\" (UID: \"2e1b0e60-347c-458c-853a-301c03aeb597\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.728652 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc49fbcc-d2e4-47a3-a1ea-7c414726b20d-srv-cert\") pod \"catalog-operator-68c6474976-c9spn\" (UID: \"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.728849 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-audit-policies\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.729856 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d27d4386-9536-4d85-877c-3bea1ebae95c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lc54g\" (UID: \"d27d4386-9536-4d85-877c-3bea1ebae95c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.728500 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f7f1347-239a-4608-9972-2a50b5216725-cert\") pod \"ingress-canary-qq8rb\" (UID: \"7f7f1347-239a-4608-9972-2a50b5216725\") " pod="openshift-ingress-canary/ingress-canary-qq8rb" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.729944 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/222fb4b4-7d0d-4305-9ba9-9f686dc10dd6-srv-cert\") pod \"olm-operator-6b444d44fb-szthm\" (UID: \"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.730363 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc49fbcc-d2e4-47a3-a1ea-7c414726b20d-profile-collector-cert\") pod \"catalog-operator-68c6474976-c9spn\" (UID: \"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.730582 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e41b788-2056-4058-89de-1a8cf9885735-trusted-ca\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: 
I1201 09:10:15.730650 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e31393e0-d2c9-4310-83a3-5958278ea7a2-tmpfs\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.731037 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-config\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.731946 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c48142f7-e9a6-49ee-b638-89e198befd03-signing-key\") pod \"service-ca-9c57cc56f-9n2mx\" (UID: \"c48142f7-e9a6-49ee-b638-89e198befd03\") " pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.733370 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-trusted-ca-bundle\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.733560 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0db10378-99f0-49e1-9148-5a0441652430-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c6qt4\" (UID: \"0db10378-99f0-49e1-9148-5a0441652430\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 
09:10:15.733942 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e41b788-2056-4058-89de-1a8cf9885735-bound-sa-token\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.734204 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d-certs\") pod \"machine-config-server-26zfv\" (UID: \"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d\") " pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.734407 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wc5\" (UniqueName: \"kubernetes.io/projected/6ac0e0a1-006e-4a5b-88e8-8cb429978c2f-kube-api-access-n5wc5\") pod \"console-operator-58897d9998-bnd8f\" (UID: \"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f\") " pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.735099 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff715e51-b68c-408d-87d9-7360399c9d9d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bdwkr\" (UID: \"ff715e51-b68c-408d-87d9-7360399c9d9d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.737352 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-oauth-config\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc 
kubenswrapper[4867]: I1201 09:10:15.737611 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de3134ab-8adb-4427-b246-f89bed1610ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zq8v5\" (UID: \"de3134ab-8adb-4427-b246-f89bed1610ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.738271 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e31393e0-d2c9-4310-83a3-5958278ea7a2-webhook-cert\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.740155 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1b0e60-347c-458c-853a-301c03aeb597-serving-cert\") pod \"service-ca-operator-777779d784-wj7xq\" (UID: \"2e1b0e60-347c-458c-853a-301c03aeb597\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.745489 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d-node-bootstrap-token\") pod \"machine-config-server-26zfv\" (UID: \"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d\") " pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.746339 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45e5dcb3-d55f-40cf-a89f-3367e84322d1-secret-volume\") pod \"collect-profiles-29409660-lcc6h\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.746454 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf9eddc0-c88c-4527-a1db-f472b220f253-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-h7kf2\" (UID: \"cf9eddc0-c88c-4527-a1db-f472b220f253\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.746784 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/222fb4b4-7d0d-4305-9ba9-9f686dc10dd6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-szthm\" (UID: \"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.747760 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e31393e0-d2c9-4310-83a3-5958278ea7a2-apiservice-cert\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.747953 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bmvq\" (UniqueName: \"kubernetes.io/projected/d27d4386-9536-4d85-877c-3bea1ebae95c-kube-api-access-7bmvq\") pod \"package-server-manager-789f6589d5-lc54g\" (UID: \"d27d4386-9536-4d85-877c-3bea1ebae95c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.750225 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-stats-auth\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.752031 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78zsn\" (UniqueName: \"kubernetes.io/projected/1162eaab-3879-41de-8561-e544952c9b3c-kube-api-access-78zsn\") pod \"machine-config-controller-84d6567774-ndvfn\" (UID: \"1162eaab-3879-41de-8561-e544952c9b3c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.764384 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kr7z\" (UniqueName: \"kubernetes.io/projected/cf0d3295-6cbb-4b9c-bed7-d37c328ddea4-kube-api-access-2kr7z\") pod \"openshift-config-operator-7777fb866f-kc82j\" (UID: \"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.775275 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:15 crc kubenswrapper[4867]: E1201 09:10:15.775741 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:16.275724761 +0000 UTC m=+137.735111515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.814057 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95dlj\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-kube-api-access-95dlj\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.824418 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnv9l\" (UniqueName: \"kubernetes.io/projected/49436ed5-4757-4aa2-92cb-63c65928893a-kube-api-access-hnv9l\") pod \"oauth-openshift-558db77b4-p989q\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.824638 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.847208 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.858413 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8d96\" (UniqueName: \"kubernetes.io/projected/d4ba76c3-7126-4151-8aaa-5d4aa710a0ea-kube-api-access-k8d96\") pod \"authentication-operator-69f744f599-t7zlw\" (UID: \"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.863087 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.867215 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpczg\" (UniqueName: \"kubernetes.io/projected/6e41b788-2056-4058-89de-1a8cf9885735-kube-api-access-jpczg\") pod \"ingress-operator-5b745b69d9-959fr\" (UID: \"6e41b788-2056-4058-89de-1a8cf9885735\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.876403 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: E1201 09:10:15.876789 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 09:10:16.376773973 +0000 UTC m=+137.836160737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.880120 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.885338 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-bound-sa-token\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.906574 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hj6l\" (UniqueName: \"kubernetes.io/projected/2207687f-bb58-4e3e-b0ca-bc1117a21d91-kube-api-access-7hj6l\") pod \"kube-storage-version-migrator-operator-b67b599dd-57cnj\" (UID: \"2207687f-bb58-4e3e-b0ca-bc1117a21d91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.919088 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.926359 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zh8bh" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.928064 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgbrs\" (UniqueName: \"kubernetes.io/projected/a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d-kube-api-access-rgbrs\") pod \"machine-config-server-26zfv\" (UID: \"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d\") " pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.957982 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.962902 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4xws\" (UniqueName: \"kubernetes.io/projected/ec06a7ff-9325-4de9-b47e-d8315761bf8d-kube-api-access-q4xws\") pod \"console-f9d7485db-kdm2m\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.972200 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5zjj\" (UniqueName: \"kubernetes.io/projected/2e1b0e60-347c-458c-853a-301c03aeb597-kube-api-access-w5zjj\") pod \"service-ca-operator-777779d784-wj7xq\" (UID: \"2e1b0e60-347c-458c-853a-301c03aeb597\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.977714 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:15 crc kubenswrapper[4867]: E1201 09:10:15.978257 
4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:16.478241658 +0000 UTC m=+137.937628412 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:15 crc kubenswrapper[4867]: I1201 09:10:15.988271 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht24r\" (UniqueName: \"kubernetes.io/projected/222fb4b4-7d0d-4305-9ba9-9f686dc10dd6-kube-api-access-ht24r\") pod \"olm-operator-6b444d44fb-szthm\" (UID: \"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:15.999773 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.005905 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24xwc\" (UniqueName: \"kubernetes.io/projected/2e876542-f5c3-495f-bb1c-72c1136686ac-kube-api-access-24xwc\") pod \"dns-operator-744455d44c-kbx9t\" (UID: \"2e876542-f5c3-495f-bb1c-72c1136686ac\") " pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.009542 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-26zfv" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.012756 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps"] Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.030975 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh87n\" (UniqueName: \"kubernetes.io/projected/c48142f7-e9a6-49ee-b638-89e198befd03-kube-api-access-fh87n\") pod \"service-ca-9c57cc56f-9n2mx\" (UID: \"c48142f7-e9a6-49ee-b638-89e198befd03\") " pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.046941 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75dqj\" (UniqueName: \"kubernetes.io/projected/ff715e51-b68c-408d-87d9-7360399c9d9d-kube-api-access-75dqj\") pod \"multus-admission-controller-857f4d67dd-bdwkr\" (UID: \"ff715e51-b68c-408d-87d9-7360399c9d9d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.057671 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.073053 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mnf\" (UniqueName: \"kubernetes.io/projected/45e5dcb3-d55f-40cf-a89f-3367e84322d1-kube-api-access-k9mnf\") pod \"collect-profiles-29409660-lcc6h\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.082585 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:16 crc kubenswrapper[4867]: E1201 09:10:16.082970 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:16.582955898 +0000 UTC m=+138.042342652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.112339 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z7p5\" (UniqueName: \"kubernetes.io/projected/c7d3f2ef-022b-41f5-84e5-6be42f48b023-kube-api-access-7z7p5\") pod \"marketplace-operator-79b997595-6w6sm\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.114197 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq7t2\" (UniqueName: \"kubernetes.io/projected/7f7f1347-239a-4608-9972-2a50b5216725-kube-api-access-fq7t2\") pod \"ingress-canary-qq8rb\" (UID: \"7f7f1347-239a-4608-9972-2a50b5216725\") " pod="openshift-ingress-canary/ingress-canary-qq8rb" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.141078 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.148300 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db10378-99f0-49e1-9148-5a0441652430-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c6qt4\" (UID: \"0db10378-99f0-49e1-9148-5a0441652430\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.161043 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.171239 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.174315 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chpmd\" (UniqueName: \"kubernetes.io/projected/e31393e0-d2c9-4310-83a3-5958278ea7a2-kube-api-access-chpmd\") pod \"packageserver-d55dfcdfc-rlc72\" (UID: \"e31393e0-d2c9-4310-83a3-5958278ea7a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.177050 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.191249 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:16 crc kubenswrapper[4867]: E1201 09:10:16.191645 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:16.691630074 +0000 UTC m=+138.151016818 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.193551 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnvlc\" (UniqueName: \"kubernetes.io/projected/311f8a58-9f7d-4fbb-b6c2-462364f8ad76-kube-api-access-fnvlc\") pod \"csi-hostpathplugin-4rxmf\" (UID: \"311f8a58-9f7d-4fbb-b6c2-462364f8ad76\") " pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.194098 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.213521 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n24vl\" (UniqueName: \"kubernetes.io/projected/a62b4517-3937-44e9-8062-982896975a9d-kube-api-access-n24vl\") pod \"openshift-controller-manager-operator-756b6f6bc6-5zdqv\" (UID: \"a62b4517-3937-44e9-8062-982896975a9d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.216084 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wdb\" (UniqueName: \"kubernetes.io/projected/cc49fbcc-d2e4-47a3-a1ea-7c414726b20d-kube-api-access-r7wdb\") pod \"catalog-operator-68c6474976-c9spn\" (UID: \"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.246109 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.247128 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.248507 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.250244 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.250586 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p989q"] Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.259341 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.260069 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftvbh\" (UniqueName: \"kubernetes.io/projected/c9d5feec-4daa-4cea-a996-0a179e42de9f-kube-api-access-ftvbh\") pod \"openshift-apiserver-operator-796bbdcf4f-r4hs5\" (UID: \"c9d5feec-4daa-4cea-a996-0a179e42de9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.264068 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.271613 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.276350 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbtmp\" (UniqueName: \"kubernetes.io/projected/5f2052f6-d2cd-4aba-b254-bea0cb7b6aba-kube-api-access-nbtmp\") pod \"router-default-5444994796-l44jd\" (UID: \"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba\") " pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.280239 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.287049 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.288303 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2"] Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.290417 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g"] Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.291355 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq7ff\" (UniqueName: \"kubernetes.io/projected/997ee678-8d54-4f91-af1d-4eefd5006f85-kube-api-access-bq7ff\") pod \"machine-config-operator-74547568cd-rd976\" (UID: \"997ee678-8d54-4f91-af1d-4eefd5006f85\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.292510 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:16 crc kubenswrapper[4867]: E1201 09:10:16.292862 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:16.792850961 +0000 UTC m=+138.252237715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.300600 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnn6\" (UniqueName: \"kubernetes.io/projected/678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5-kube-api-access-wtnn6\") pod \"control-plane-machine-set-operator-78cbb6b69f-mph7x\" (UID: \"678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.307510 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.317081 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlbxt\" (UniqueName: \"kubernetes.io/projected/3e950295-ab80-42c7-b1f2-f5869448b824-kube-api-access-nlbxt\") pod \"etcd-operator-b45778765-t266c\" (UID: \"3e950295-ab80-42c7-b1f2-f5869448b824\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.336414 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.341528 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qq8rb" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.393138 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:16 crc kubenswrapper[4867]: E1201 09:10:16.394212 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:16.894186083 +0000 UTC m=+138.353572927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.407662 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfx2j\" (UniqueName: \"kubernetes.io/projected/6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44-kube-api-access-rfx2j\") pod \"dns-default-v6frd\" (UID: \"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44\") " pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:16 crc kubenswrapper[4867]: W1201 09:10:16.408261 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27d4386_9536_4d85_877c_3bea1ebae95c.slice/crio-ca074d5249ec4bd6ef3f103280df9d942f1efc55f921b30e6b956960a9e3d441 WatchSource:0}: Error finding container ca074d5249ec4bd6ef3f103280df9d942f1efc55f921b30e6b956960a9e3d441: Status 404 returned error can't find the container with id ca074d5249ec4bd6ef3f103280df9d942f1efc55f921b30e6b956960a9e3d441 Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.476282 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn"] Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.486723 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.494494 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:16 crc kubenswrapper[4867]: E1201 09:10:16.494779 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:16.994765961 +0000 UTC m=+138.454152715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.499197 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.507029 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.513172 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.519293 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.527540 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" event={"ID":"56d53753-56f3-40fc-ba22-f635027ed42d","Type":"ContainerStarted","Data":"38047a930715d85448f081019c9842302f4c0b229b75080678bb7de450d5c391"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.529240 4867 generic.go:334] "Generic (PLEG): container finished" podID="8b075132-2629-49ad-9361-42fe48ae5b57" containerID="5bebeeeb704c9253d4a5bfeb755855bba0430f0b1606c01cbd8108c5a796a8ae" exitCode=0 Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.529645 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" event={"ID":"8b075132-2629-49ad-9361-42fe48ae5b57","Type":"ContainerDied","Data":"5bebeeeb704c9253d4a5bfeb755855bba0430f0b1606c01cbd8108c5a796a8ae"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.541407 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.576051 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bnd8f"] Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.596384 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.596943 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:16 crc kubenswrapper[4867]: E1201 09:10:16.597277 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:17.097257336 +0000 UTC m=+138.556644090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.599085 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-26zfv" event={"ID":"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d","Type":"ContainerStarted","Data":"c3c77164e0e40c845490c58f47205413bdae83029ab13ce29458b93a4caa9821"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.601726 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps" event={"ID":"05fcf7ce-2e60-4fc5-b60b-be4b8b5d4963","Type":"ContainerStarted","Data":"dc0c2f5f914e32ee611985a2a9f219b1e17f28d8e6e24ed7eff438c2c311a840"} Dec 01 
09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.614006 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" event={"ID":"d27d4386-9536-4d85-877c-3bea1ebae95c","Type":"ContainerStarted","Data":"ca074d5249ec4bd6ef3f103280df9d942f1efc55f921b30e6b956960a9e3d441"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.630428 4867 generic.go:334] "Generic (PLEG): container finished" podID="8e6598f7-031a-4561-bde4-ae61121a17cd" containerID="f86c03ba590ae74a76d687459b79ec19db0d27302405136f1cc66f845c386739" exitCode=0 Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.630530 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" event={"ID":"8e6598f7-031a-4561-bde4-ae61121a17cd","Type":"ContainerDied","Data":"f86c03ba590ae74a76d687459b79ec19db0d27302405136f1cc66f845c386739"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.670100 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5"] Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.676750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" event={"ID":"5e1caefa-d624-4351-89a3-d8c33a7924d6","Type":"ContainerStarted","Data":"28afe2da2db48c4c48e3f07441d37032814bb67d95ffdef1ad4a100208bee887"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.685855 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" event={"ID":"cf9eddc0-c88c-4527-a1db-f472b220f253","Type":"ContainerStarted","Data":"f560230245b7a81361776fe913ce1e3613b96166dd5b19b9ae8758b5eab28225"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.687912 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" event={"ID":"61de6f28-6c9e-4175-a1e3-e29e8fed45f6","Type":"ContainerStarted","Data":"5bf33a7da7f5c2803a50acb2c165fcdde032dbbe0e27432d02706c94fceca5d4"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.696346 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" event={"ID":"bc693be6-558a-41e9-96cd-40061ff9ae5d","Type":"ContainerStarted","Data":"b23ee4aeebdadf10b06973288177e3a60db71204a2cd0d0fee5cb80ea87c28a4"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.698441 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:16 crc kubenswrapper[4867]: E1201 09:10:16.699323 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:17.199307908 +0000 UTC m=+138.658694652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.699630 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.717229 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" event={"ID":"49436ed5-4757-4aa2-92cb-63c65928893a","Type":"ContainerStarted","Data":"1f844ff10a7ef19edfe609b1c457b6b67e4b87487df4208d5f3756048a4bd8be"} Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.728931 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.755939 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zh8bh"] Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.800650 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:16 crc kubenswrapper[4867]: E1201 09:10:16.801486 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:17.301469593 +0000 UTC m=+138.760856347 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.816390 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kc82j"] Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.903169 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:16 crc kubenswrapper[4867]: E1201 09:10:16.903627 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:17.403606657 +0000 UTC m=+138.862993411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:16 crc kubenswrapper[4867]: I1201 09:10:16.904847 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72"] Dec 01 09:10:16 crc kubenswrapper[4867]: W1201 09:10:16.908762 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac0e0a1_006e_4a5b_88e8_8cb429978c2f.slice/crio-e194ca652c20893da3d6409ac43c8f671322b370b484d70e0094c494a3b75011 WatchSource:0}: Error finding container e194ca652c20893da3d6409ac43c8f671322b370b484d70e0094c494a3b75011: Status 404 returned error can't find the container with id e194ca652c20893da3d6409ac43c8f671322b370b484d70e0094c494a3b75011 Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.007964 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.008368 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:17.508338627 +0000 UTC m=+138.967725391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.008512 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.009033 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:17.509021657 +0000 UTC m=+138.968408411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.039619 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-545ws" podStartSLOduration=119.039599234 podStartE2EDuration="1m59.039599234s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:17.03876839 +0000 UTC m=+138.498155144" watchObservedRunningTime="2025-12-01 09:10:17.039599234 +0000 UTC m=+138.498985998" Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.091395 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj"] Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.109384 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.109779 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 09:10:17.609759521 +0000 UTC m=+139.069146275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.210910 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.211329 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:17.711318588 +0000 UTC m=+139.170705342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.228282 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7sqxq" podStartSLOduration=120.228262504 podStartE2EDuration="2m0.228262504s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:17.194257698 +0000 UTC m=+138.653644452" watchObservedRunningTime="2025-12-01 09:10:17.228262504 +0000 UTC m=+138.687649278" Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.263040 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7blkm" podStartSLOduration=119.263024004 podStartE2EDuration="1m59.263024004s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:17.232927662 +0000 UTC m=+138.692314436" watchObservedRunningTime="2025-12-01 09:10:17.263024004 +0000 UTC m=+138.722410758" Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.315317 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.315634 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:17.815618516 +0000 UTC m=+139.275005270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.323187 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-t7zlw"] Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.341351 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bdwkr"] Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.416515 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.416847 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:17.916835594 +0000 UTC m=+139.376222348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: W1201 09:10:17.487746 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4ba76c3_7126_4151_8aaa_5d4aa710a0ea.slice/crio-9d985e9860582f93216fadacdf805ed80d1e42b6950b0999bbd026544da6450a WatchSource:0}: Error finding container 9d985e9860582f93216fadacdf805ed80d1e42b6950b0999bbd026544da6450a: Status 404 returned error can't find the container with id 9d985e9860582f93216fadacdf805ed80d1e42b6950b0999bbd026544da6450a Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.497941 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" podStartSLOduration=119.497924531 podStartE2EDuration="1m59.497924531s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:17.494456008 +0000 UTC m=+138.953842762" watchObservedRunningTime="2025-12-01 09:10:17.497924531 +0000 UTC m=+138.957311285" Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.521362 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.521673 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.021658286 +0000 UTC m=+139.481045040 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.523345 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn"] Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.568283 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-959fr"] Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.610558 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm"] Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.625183 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") 
" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.625499 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.12548817 +0000 UTC m=+139.584874914 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.671923 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6w6sm"] Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.727137 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.727360 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.227343077 +0000 UTC m=+139.686729831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.729407 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h"] Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.730012 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.730360 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.230347765 +0000 UTC m=+139.689734519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.795140 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xd8vg" podStartSLOduration=121.795121484 podStartE2EDuration="2m1.795121484s" podCreationTimestamp="2025-12-01 09:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:17.750103874 +0000 UTC m=+139.209490628" watchObservedRunningTime="2025-12-01 09:10:17.795121484 +0000 UTC m=+139.254508238" Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.813142 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps" event={"ID":"05fcf7ce-2e60-4fc5-b60b-be4b8b5d4963","Type":"ContainerStarted","Data":"e5df4136ad8efc1637113cffe88ccf34cfe866610ddaac600293ee23fe1711d8"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.821180 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" event={"ID":"1162eaab-3879-41de-8561-e544952c9b3c","Type":"ContainerStarted","Data":"072387d306c070378b0dd3f13132155eb70ffc43081352e5c8a60fcae84fc5ea"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.830437 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.830842 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.330827871 +0000 UTC m=+139.790214625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.852297 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-26zfv" event={"ID":"a9e0b3a5-0f6d-47cb-a2e7-a09dc8f90d5d","Type":"ContainerStarted","Data":"6d7970fd854e99f44a51d750659502136050dcc29627431038053426f2c39ef4"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.861147 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" event={"ID":"2207687f-bb58-4e3e-b0ca-bc1117a21d91","Type":"ContainerStarted","Data":"8743701b256d154cfc52966c6d6b4f95b5013d97f7c3399c96a2e0987655f616"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.863446 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" 
event={"ID":"de3134ab-8adb-4427-b246-f89bed1610ed","Type":"ContainerStarted","Data":"6b9c34b74a0e44b22ba821b1a8fe194b53aace6abc6e5e20a560b64a50959380"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.872378 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" podStartSLOduration=120.872359938 podStartE2EDuration="2m0.872359938s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:17.868894946 +0000 UTC m=+139.328281700" watchObservedRunningTime="2025-12-01 09:10:17.872359938 +0000 UTC m=+139.331746692" Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.872513 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" event={"ID":"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4","Type":"ContainerStarted","Data":"7c26edd42482fe6c0a73431f69673f9da5dad6627e9aad4b92124d65bc31bc29"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.873958 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zh8bh" event={"ID":"c699ddca-61b5-4f9a-ae7c-48653d9557f8","Type":"ContainerStarted","Data":"6257b9014240bb4c22ecf19719eb959c91c298cccc8e3ac6702ee2b2cc9914f9"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.884393 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" event={"ID":"ff715e51-b68c-408d-87d9-7360399c9d9d","Type":"ContainerStarted","Data":"eb27929700362f20f7823edd72bb13c5cb71bf379f8390b7101da936688cfa68"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.928766 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" 
event={"ID":"e31393e0-d2c9-4310-83a3-5958278ea7a2","Type":"ContainerStarted","Data":"321510e5b9fa21724edbfcfb1677106e380264490fbc32c308f49b0262c93a93"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.934966 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:17 crc kubenswrapper[4867]: E1201 09:10:17.935546 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.43553436 +0000 UTC m=+139.894921114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.941423 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" event={"ID":"d27d4386-9536-4d85-877c-3bea1ebae95c","Type":"ContainerStarted","Data":"de0ec6591e7dd047c5f5a1d8581dd9faa786f1a1a8105ed20687d1bf4c9e47c4"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.956337 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bnd8f" 
event={"ID":"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f","Type":"ContainerStarted","Data":"e194ca652c20893da3d6409ac43c8f671322b370b484d70e0094c494a3b75011"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.966023 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" event={"ID":"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea","Type":"ContainerStarted","Data":"9d985e9860582f93216fadacdf805ed80d1e42b6950b0999bbd026544da6450a"} Dec 01 09:10:17 crc kubenswrapper[4867]: I1201 09:10:17.982750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-l44jd" event={"ID":"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba","Type":"ContainerStarted","Data":"7da65f1139d71d0f88f5acad2ff99688fd5cece2179e90fae0d4539b0b6094dd"} Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.036392 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.037761 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.537741656 +0000 UTC m=+139.997128420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.154441 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.155959 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.655945742 +0000 UTC m=+140.115332496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.211554 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-26zfv" podStartSLOduration=5.211533461 podStartE2EDuration="5.211533461s" podCreationTimestamp="2025-12-01 09:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:18.20363828 +0000 UTC m=+139.663025034" watchObservedRunningTime="2025-12-01 09:10:18.211533461 +0000 UTC m=+139.670920215" Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.213726 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kdm2m"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.263352 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.263652 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 09:10:18.763637099 +0000 UTC m=+140.223023853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.269451 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9n2mx"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.303557 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.364533 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.364840 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.864829185 +0000 UTC m=+140.324215939 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.381931 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.391026 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.452401 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t266c"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.481713 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.482256 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:18.982240638 +0000 UTC m=+140.441627392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.484903 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rd976"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.484967 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qq8rb"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.484981 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.533344 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kbx9t"] Dec 01 09:10:18 crc kubenswrapper[4867]: W1201 09:10:18.555355 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc48142f7_e9a6_49ee_b638_89e198befd03.slice/crio-8920a9bdc28948f2e8b6fc4451b6bd7a47139a94c967bfca8ab6ec9372a621e4 WatchSource:0}: Error finding container 8920a9bdc28948f2e8b6fc4451b6bd7a47139a94c967bfca8ab6ec9372a621e4: Status 404 returned error can't find the container with id 8920a9bdc28948f2e8b6fc4451b6bd7a47139a94c967bfca8ab6ec9372a621e4 Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.564204 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.584102 
4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.584616 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:19.084603739 +0000 UTC m=+140.543990483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:18 crc kubenswrapper[4867]: W1201 09:10:18.594279 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e1b0e60_347c_458c_853a_301c03aeb597.slice/crio-15976a79c419b7da89f2e16d54276987dd3ca97c646867061b6199cb04b6778c WatchSource:0}: Error finding container 15976a79c419b7da89f2e16d54276987dd3ca97c646867061b6199cb04b6778c: Status 404 returned error can't find the container with id 15976a79c419b7da89f2e16d54276987dd3ca97c646867061b6199cb04b6778c Dec 01 09:10:18 crc kubenswrapper[4867]: W1201 09:10:18.616053 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda62b4517_3937_44e9_8062_982896975a9d.slice/crio-7f406b708bc9323e7477d5bf8547beb06e57bceb0466418254806a55d03b6569 WatchSource:0}: Error finding container 7f406b708bc9323e7477d5bf8547beb06e57bceb0466418254806a55d03b6569: Status 404 returned error can't find the container with id 7f406b708bc9323e7477d5bf8547beb06e57bceb0466418254806a55d03b6569 Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.680964 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v6frd"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.684662 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.685081 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:19.185066973 +0000 UTC m=+140.644453727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:18 crc kubenswrapper[4867]: W1201 09:10:18.689830 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod997ee678_8d54_4f91_af1d_4eefd5006f85.slice/crio-37639f029d46828d76505eaddf322d1704a8f46244be3623bb58554a073063db WatchSource:0}: Error finding container 37639f029d46828d76505eaddf322d1704a8f46244be3623bb58554a073063db: Status 404 returned error can't find the container with id 37639f029d46828d76505eaddf322d1704a8f46244be3623bb58554a073063db Dec 01 09:10:18 crc kubenswrapper[4867]: W1201 09:10:18.690064 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e950295_ab80_42c7_b1f2_f5869448b824.slice/crio-bf13412102d5f57363e1c59fcbdc6f50b97c2b92b2fb07039fd83da963347a55 WatchSource:0}: Error finding container bf13412102d5f57363e1c59fcbdc6f50b97c2b92b2fb07039fd83da963347a55: Status 404 returned error can't find the container with id bf13412102d5f57363e1c59fcbdc6f50b97c2b92b2fb07039fd83da963347a55 Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.709932 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4rxmf"] Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.785851 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.786706 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:19.286693174 +0000 UTC m=+140.746079928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:18 crc kubenswrapper[4867]: W1201 09:10:18.826199 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311f8a58_9f7d_4fbb_b6c2_462364f8ad76.slice/crio-7cc61d962c7759a438713729623d992872afc7db274ffc388598beefd826fcb3 WatchSource:0}: Error finding container 7cc61d962c7759a438713729623d992872afc7db274ffc388598beefd826fcb3: Status 404 returned error can't find the container with id 7cc61d962c7759a438713729623d992872afc7db274ffc388598beefd826fcb3 Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.886922 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.887389 4867 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:19.387375095 +0000 UTC m=+140.846761849 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:18 crc kubenswrapper[4867]: I1201 09:10:18.990245 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:18 crc kubenswrapper[4867]: E1201 09:10:18.990570 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:19.490558239 +0000 UTC m=+140.949944993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.018024 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" event={"ID":"2e876542-f5c3-495f-bb1c-72c1136686ac","Type":"ContainerStarted","Data":"e0d444673917c9f183d1299bdc79c2b23d3c996430e37e93a8ca6420cadc272a"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.029422 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" event={"ID":"c7d3f2ef-022b-41f5-84e5-6be42f48b023","Type":"ContainerStarted","Data":"ff85a59df9a2319c7da2c78eb739b04593065f603d6b6b89e2d8f559dd9a137f"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.031743 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" event={"ID":"c48142f7-e9a6-49ee-b638-89e198befd03","Type":"ContainerStarted","Data":"8920a9bdc28948f2e8b6fc4451b6bd7a47139a94c967bfca8ab6ec9372a621e4"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.032611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" event={"ID":"311f8a58-9f7d-4fbb-b6c2-462364f8ad76","Type":"ContainerStarted","Data":"7cc61d962c7759a438713729623d992872afc7db274ffc388598beefd826fcb3"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.040752 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" 
event={"ID":"1162eaab-3879-41de-8561-e544952c9b3c","Type":"ContainerStarted","Data":"5d57fa705f22e9291d38738e8d9f4b575846912c8d7fda3c8a3923c884f06f9a"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.089866 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zh8bh" event={"ID":"c699ddca-61b5-4f9a-ae7c-48653d9557f8","Type":"ContainerStarted","Data":"1ac87eaef813c92666ca4fa349b76cc7a87aa09cfb15d1799579d3e9d5fd720e"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.091122 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.091452 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:19.591438087 +0000 UTC m=+141.050824841 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.124895 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" event={"ID":"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d","Type":"ContainerStarted","Data":"f2d6030e23e92f1aa35dfde71231ed8d803c088655ee485b9c5d282cc3684bef"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.127331 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" event={"ID":"678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5","Type":"ContainerStarted","Data":"d3e7079073e4edc175315e53036116f2c9f3ea91512c60b07d18df1e7be3a25e"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.130675 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" event={"ID":"49436ed5-4757-4aa2-92cb-63c65928893a","Type":"ContainerStarted","Data":"a2a757aaa994e5d76a9f3552ebd4aad6006194498e32925bdf5255331b7ceca7"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.131750 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.134060 4867 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-p989q container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection 
refused" start-of-body= Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.134184 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" podUID="49436ed5-4757-4aa2-92cb-63c65928893a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.139321 4867 generic.go:334] "Generic (PLEG): container finished" podID="cf0d3295-6cbb-4b9c-bed7-d37c328ddea4" containerID="6e7802bb7862be1d4d3e1dce020d7574569343a85b9676489d28395086554a11" exitCode=0 Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.140235 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" event={"ID":"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4","Type":"ContainerDied","Data":"6e7802bb7862be1d4d3e1dce020d7574569343a85b9676489d28395086554a11"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.193906 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.194227 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:19.69421629 +0000 UTC m=+141.153603044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.236285 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" event={"ID":"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6","Type":"ContainerStarted","Data":"5b75adbecc20935e01365bdd282810ee7fc47ab0be43f9b754e62835183cf3a0"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.283659 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps" event={"ID":"05fcf7ce-2e60-4fc5-b60b-be4b8b5d4963","Type":"ContainerStarted","Data":"0deeb6b17f389ff786e716e6e8da8e1ef07a143b3ae72bd515fa27a1d4bc5007"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.291713 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" event={"ID":"45e5dcb3-d55f-40cf-a89f-3367e84322d1","Type":"ContainerStarted","Data":"353d5f2169c28ef07dca964e4824a37cd934bf5c994c2aa49ab2cc35e516c7f5"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.296475 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.297377 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:19.797355594 +0000 UTC m=+141.256742348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.298560 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" event={"ID":"e31393e0-d2c9-4310-83a3-5958278ea7a2","Type":"ContainerStarted","Data":"e1ca5cd19c37fe19e3117295a9d05c407426d34a169059d4e134aae8d24692ec"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.299805 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.305840 4867 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rlc72 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body= Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.305891 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" podUID="e31393e0-d2c9-4310-83a3-5958278ea7a2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 
10.217.0.32:5443: connect: connection refused" Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.306330 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" event={"ID":"c9d5feec-4daa-4cea-a996-0a179e42de9f","Type":"ContainerStarted","Data":"2db4f7f9e95be07d8e5d8a13994091cb99635a88c547ff429bf14d988c4a86da"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.321342 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kdm2m" event={"ID":"ec06a7ff-9325-4de9-b47e-d8315761bf8d","Type":"ContainerStarted","Data":"84acfbda6f82e89267ec39872d471581fcb752c396e4b865f01f34d7f4ce009d"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.337139 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" event={"ID":"2e1b0e60-347c-458c-853a-301c03aeb597","Type":"ContainerStarted","Data":"15976a79c419b7da89f2e16d54276987dd3ca97c646867061b6199cb04b6778c"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.352601 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" event={"ID":"0db10378-99f0-49e1-9148-5a0441652430","Type":"ContainerStarted","Data":"d834f745791be8bf79e57ae6209478629b507446dc788a2d70a2ef3c30c31f2a"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.357390 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" event={"ID":"997ee678-8d54-4f91-af1d-4eefd5006f85","Type":"ContainerStarted","Data":"37639f029d46828d76505eaddf322d1704a8f46244be3623bb58554a073063db"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.363692 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" event={"ID":"2207687f-bb58-4e3e-b0ca-bc1117a21d91","Type":"ContainerStarted","Data":"fa90227c671bc55018d515b0662ba149761e7c90d077db0e9781837d0b9dcc48"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.367166 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v6frd" event={"ID":"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44","Type":"ContainerStarted","Data":"72c7fe32062e0c041af5ea2ce19cd1243e93bdbdf1ebe124deda541b9b420cc9"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.368684 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.369494 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qq8rb" event={"ID":"7f7f1347-239a-4608-9972-2a50b5216725","Type":"ContainerStarted","Data":"f626c19be9cda504fdd1bc72ff081c0648dddd2acd36e734511a6d69824706c7"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.369619 4867 patch_prober.go:28] interesting pod/console-operator-58897d9998-bnd8f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.369670 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bnd8f" podUID="6ac0e0a1-006e-4a5b-88e8-8cb429978c2f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.379909 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" event={"ID":"3e950295-ab80-42c7-b1f2-f5869448b824","Type":"ContainerStarted","Data":"bf13412102d5f57363e1c59fcbdc6f50b97c2b92b2fb07039fd83da963347a55"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.396314 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" event={"ID":"cf9eddc0-c88c-4527-a1db-f472b220f253","Type":"ContainerStarted","Data":"da10b8679981d885d35a8a7ba511d42ca455ba2b5544767b85cb3b2cd1333f0b"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.401550 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.401983 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:19.901969041 +0000 UTC m=+141.361355795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.404260 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" event={"ID":"a62b4517-3937-44e9-8062-982896975a9d","Type":"ContainerStarted","Data":"7f406b708bc9323e7477d5bf8547beb06e57bceb0466418254806a55d03b6569"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.419756 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" event={"ID":"6e41b788-2056-4058-89de-1a8cf9885735","Type":"ContainerStarted","Data":"433f368f1cd58b63a52d905102fb5c4e2d5b7040edab05269aac05de1d3aa71c"} Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.502583 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.503294 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.003258041 +0000 UTC m=+141.462644795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.608505 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.610637 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.110609168 +0000 UTC m=+141.569995922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.711329 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.711979 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.211960239 +0000 UTC m=+141.671346993 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.712208 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.712529 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.212521145 +0000 UTC m=+141.671907899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.812771 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.813265 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.313248008 +0000 UTC m=+141.772634762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.917509 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:19 crc kubenswrapper[4867]: E1201 09:10:19.917888 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.417876786 +0000 UTC m=+141.877263540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.921963 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7kf2" podStartSLOduration=121.921937525 podStartE2EDuration="2m1.921937525s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:19.906266195 +0000 UTC m=+141.365652959" watchObservedRunningTime="2025-12-01 09:10:19.921937525 +0000 UTC m=+141.381324279" Dec 01 09:10:19 crc kubenswrapper[4867]: I1201 09:10:19.946344 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-57cnj" podStartSLOduration=121.94632864 podStartE2EDuration="2m1.94632864s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:19.941017935 +0000 UTC m=+141.400404689" watchObservedRunningTime="2025-12-01 09:10:19.94632864 +0000 UTC m=+141.405715394" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.019513 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.020019 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.52000304 +0000 UTC m=+141.979389794 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.038210 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bnd8f" podStartSLOduration=123.038195143 podStartE2EDuration="2m3.038195143s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:19.987989122 +0000 UTC m=+141.447375876" watchObservedRunningTime="2025-12-01 09:10:20.038195143 +0000 UTC m=+141.497581897" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.038396 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsfps" podStartSLOduration=122.03839198 podStartE2EDuration="2m2.03839198s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
09:10:20.036240296 +0000 UTC m=+141.495627050" watchObservedRunningTime="2025-12-01 09:10:20.03839198 +0000 UTC m=+141.497778734" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.130191 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.130561 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.63054748 +0000 UTC m=+142.089934234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.184206 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" podStartSLOduration=122.184186324 podStartE2EDuration="2m2.184186324s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:20.137465894 +0000 UTC m=+141.596852648" watchObservedRunningTime="2025-12-01 09:10:20.184186324 +0000 UTC m=+141.643573078" 
Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.231986 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.232229 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.732215422 +0000 UTC m=+142.191602176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.245946 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" podStartSLOduration=123.245930804 podStartE2EDuration="2m3.245930804s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:20.245607794 +0000 UTC m=+141.704994548" watchObservedRunningTime="2025-12-01 09:10:20.245930804 +0000 UTC m=+141.705317558" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.246866 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" podStartSLOduration=122.246861531 podStartE2EDuration="2m2.246861531s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:20.182097802 +0000 UTC m=+141.641484556" watchObservedRunningTime="2025-12-01 09:10:20.246861531 +0000 UTC m=+141.706248285" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.334303 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.334773 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.834762867 +0000 UTC m=+142.294149611 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.435309 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.435651 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:20.935638155 +0000 UTC m=+142.395024909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.438710 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" event={"ID":"3e950295-ab80-42c7-b1f2-f5869448b824","Type":"ContainerStarted","Data":"b71e718a04c208c49be5a98460a84df34e0a5c40c09722512a84e8a85e134a33"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.444876 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" event={"ID":"45e5dcb3-d55f-40cf-a89f-3367e84322d1","Type":"ContainerStarted","Data":"8f899fc084ad650caac3f1299eb696d18ff910f8c08ff8465a177917e24b2f4e"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.460068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bnd8f" event={"ID":"6ac0e0a1-006e-4a5b-88e8-8cb429978c2f","Type":"ContainerStarted","Data":"bd2c99b84b6cb7170d210248e212b834a932597075471127950531ed0e3ae016"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.466386 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" event={"ID":"d4ba76c3-7126-4151-8aaa-5d4aa710a0ea","Type":"ContainerStarted","Data":"69f185f5d0c2d9026b74820dc265e65d7cb688a8bd131dba402aa467d8ade5ff"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.468068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-l44jd" 
event={"ID":"5f2052f6-d2cd-4aba-b254-bea0cb7b6aba","Type":"ContainerStarted","Data":"02bce9e833c5ac8ad0c2fec607a735b2e02f4096853e199461ca1164c0b6e9e0"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.469209 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" event={"ID":"c7d3f2ef-022b-41f5-84e5-6be42f48b023","Type":"ContainerStarted","Data":"b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.469751 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.479605 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-t266c" podStartSLOduration=122.479571873 podStartE2EDuration="2m2.479571873s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:20.478403089 +0000 UTC m=+141.937789843" watchObservedRunningTime="2025-12-01 09:10:20.479571873 +0000 UTC m=+141.938958627" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.480775 4867 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6w6sm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.480825 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" podUID="c7d3f2ef-022b-41f5-84e5-6be42f48b023" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: 
connect: connection refused" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.481281 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" event={"ID":"ff715e51-b68c-408d-87d9-7360399c9d9d","Type":"ContainerStarted","Data":"0c666b08e833595f36836aaf981a05da1a5b32fe0350ea789cf237d12adf6333"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.482423 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" event={"ID":"222fb4b4-7d0d-4305-9ba9-9f686dc10dd6","Type":"ContainerStarted","Data":"f1a12211081907ce5a08e978b69bd5079bc96281f7aab1913fca46227ff7ee78"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.482934 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.485632 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" event={"ID":"997ee678-8d54-4f91-af1d-4eefd5006f85","Type":"ContainerStarted","Data":"10b034096311d88223d3233d654f0fcd26a5f476102e7e34bc2c751f2fdebbda"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.489061 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.497767 4867 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-szthm container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.497829 4867 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" podUID="222fb4b4-7d0d-4305-9ba9-9f686dc10dd6" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.497858 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" event={"ID":"cc49fbcc-d2e4-47a3-a1ea-7c414726b20d","Type":"ContainerStarted","Data":"03d49a8ebc126c0a6aba18ec06926cdb390c3d4620066b122dcbff346c55228c"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.498220 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.522529 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" event={"ID":"8e6598f7-031a-4561-bde4-ae61121a17cd","Type":"ContainerStarted","Data":"27d9253719195f88408e94d8563874801985619bf1fe65f4077978e0a039be87"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.526120 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-t7zlw" podStartSLOduration=123.526097688 podStartE2EDuration="2m3.526097688s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:20.521444521 +0000 UTC m=+141.980831265" watchObservedRunningTime="2025-12-01 09:10:20.526097688 +0000 UTC m=+141.985484442" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.537729 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.544403 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.044387223 +0000 UTC m=+142.503773977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.576960 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:20 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:20 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:20 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.577069 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.589421 4867 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zq8v5" event={"ID":"de3134ab-8adb-4427-b246-f89bed1610ed","Type":"ContainerStarted","Data":"2824480fea1adb6a660ff895ec148b1a499795f6218c44e3cd73d1fd705cc032"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.592208 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" podStartSLOduration=123.592197525 podStartE2EDuration="2m3.592197525s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:20.590414653 +0000 UTC m=+142.049801407" watchObservedRunningTime="2025-12-01 09:10:20.592197525 +0000 UTC m=+142.051584279" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.613344 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.628135 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" event={"ID":"6e41b788-2056-4058-89de-1a8cf9885735","Type":"ContainerStarted","Data":"362fa73992f5f23be6eecb3b367cd8e8b53d67e767cb70539f8d798df0f3168b"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.636757 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" event={"ID":"8b075132-2629-49ad-9361-42fe48ae5b57","Type":"ContainerStarted","Data":"f8f3602e974427e3cd4532d7486cc1b3c06854d5cd278f95f8d0beeabce96fa0"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.645451 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.649139 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.149114684 +0000 UTC m=+142.608501438 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.651633 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kdm2m" event={"ID":"ec06a7ff-9325-4de9-b47e-d8315761bf8d","Type":"ContainerStarted","Data":"004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.661424 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" event={"ID":"2e1b0e60-347c-458c-853a-301c03aeb597","Type":"ContainerStarted","Data":"ce59bce8cb0b737b03e8f2cc037b6c3faef9ad111ab35f31ba11ba2f98579fe4"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.661636 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.662721 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.162709302 +0000 UTC m=+142.622096056 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.691613 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" event={"ID":"d27d4386-9536-4d85-877c-3bea1ebae95c","Type":"ContainerStarted","Data":"37ce7b6daa1b4f38fa17e78ffe23b5aff311546b3b06ff946150de2f49e99627"} Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.691873 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.693848 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zh8bh" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.698198 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-zh8bh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 
10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.698244 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zh8bh" podUID="c699ddca-61b5-4f9a-ae7c-48653d9557f8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.764586 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.765802 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.265775924 +0000 UTC m=+142.725162678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.870911 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.871578 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.371563235 +0000 UTC m=+142.830949989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:20 crc kubenswrapper[4867]: I1201 09:10:20.971992 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:20 crc kubenswrapper[4867]: E1201 09:10:20.972313 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.472297348 +0000 UTC m=+142.931684102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.006029 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" podStartSLOduration=123.006015757 podStartE2EDuration="2m3.006015757s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:20.764235729 +0000 UTC m=+142.223622473" watchObservedRunningTime="2025-12-01 09:10:21.006015757 +0000 UTC m=+142.465402511" Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.076132 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.576117612 +0000 UTC m=+143.035504366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.076170 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.148185 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-l44jd" podStartSLOduration=123.148165185 podStartE2EDuration="2m3.148165185s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.006865431 +0000 UTC m=+142.466252185" watchObservedRunningTime="2025-12-01 09:10:21.148165185 +0000 UTC m=+142.607551939" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.177025 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.177248 4867 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.677217356 +0000 UTC m=+143.136604110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.177330 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.177647 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.677634588 +0000 UTC m=+143.137021342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.276685 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" podStartSLOduration=123.276664722 podStartE2EDuration="2m3.276664722s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.195388829 +0000 UTC m=+142.654775583" watchObservedRunningTime="2025-12-01 09:10:21.276664722 +0000 UTC m=+142.736051476" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.279098 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.279540 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.779521936 +0000 UTC m=+143.238908690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.339258 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wj7xq" podStartSLOduration=123.339240626 podStartE2EDuration="2m3.339240626s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.334769905 +0000 UTC m=+142.794156649" watchObservedRunningTime="2025-12-01 09:10:21.339240626 +0000 UTC m=+142.798627380" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.340764 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zh8bh" podStartSLOduration=124.34075181 podStartE2EDuration="2m4.34075181s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.278523296 +0000 UTC m=+142.737910050" watchObservedRunningTime="2025-12-01 09:10:21.34075181 +0000 UTC m=+142.800138564" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.380966 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: 
\"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.381343 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.88132822 +0000 UTC m=+143.340714974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.396738 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-c9spn" podStartSLOduration=123.3967149 podStartE2EDuration="2m3.3967149s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.391587051 +0000 UTC m=+142.850973815" watchObservedRunningTime="2025-12-01 09:10:21.3967149 +0000 UTC m=+142.856101654" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.462025 4867 patch_prober.go:28] interesting pod/console-operator-58897d9998-bnd8f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.462311 4867 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bnd8f" podUID="6ac0e0a1-006e-4a5b-88e8-8cb429978c2f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.482421 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.482823 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:21.982776164 +0000 UTC m=+143.442162918 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.491216 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" podStartSLOduration=123.491196881 podStartE2EDuration="2m3.491196881s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.489766649 +0000 UTC m=+142.949153403" watchObservedRunningTime="2025-12-01 09:10:21.491196881 +0000 UTC m=+142.950583635" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.498999 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:21 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:21 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:21 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.499051 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.551143 4867 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-console/console-f9d7485db-kdm2m" podStartSLOduration=124.551123558 podStartE2EDuration="2m4.551123558s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.548964005 +0000 UTC m=+143.008350759" watchObservedRunningTime="2025-12-01 09:10:21.551123558 +0000 UTC m=+143.010510312" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.584780 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.585171 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.085157346 +0000 UTC m=+143.544544100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.600880 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.600937 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.685611 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.685804 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.185775836 +0000 UTC m=+143.645162600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.686040 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.686412 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.186396114 +0000 UTC m=+143.645782868 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.693483 4867 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rlc72 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.693542 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" podUID="e31393e0-d2c9-4310-83a3-5958278ea7a2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.694958 4867 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-p989q container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.695021 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" podUID="49436ed5-4757-4aa2-92cb-63c65928893a" containerName="oauth-openshift" probeResult="failure" 
output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.720433 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" event={"ID":"6e41b788-2056-4058-89de-1a8cf9885735","Type":"ContainerStarted","Data":"03ee05a4222fa30f70c346d0e7cffa5520ffab2db3460050286ce2491cc8da75"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.729640 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qq8rb" event={"ID":"7f7f1347-239a-4608-9972-2a50b5216725","Type":"ContainerStarted","Data":"21fdca160490660bcf0a5645476236b228476387ea3b364540f3f9ff7b45eee2"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.736446 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" event={"ID":"8b075132-2629-49ad-9361-42fe48ae5b57","Type":"ContainerStarted","Data":"7bd5ab2e01e2c692e8f99a221630d2c9db3b71bf008d9a5a0086b0eefc623a75"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.752180 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" event={"ID":"2e876542-f5c3-495f-bb1c-72c1136686ac","Type":"ContainerStarted","Data":"eaa586f553b3d429edc116b9facaa8f210f2317b8a8ed821440745b4597ef338"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.752241 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" event={"ID":"2e876542-f5c3-495f-bb1c-72c1136686ac","Type":"ContainerStarted","Data":"e335a1d73b78af3a284900e20bc5660fdadce0a9c2a1459e7086ca3d937b3987"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.763349 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" 
event={"ID":"cf0d3295-6cbb-4b9c-bed7-d37c328ddea4","Type":"ContainerStarted","Data":"751f96896c38862859c892585adf6fd4c7d4cc42d2af01186e11c3286979f9a3"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.763509 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.766126 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" event={"ID":"678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5","Type":"ContainerStarted","Data":"e1d11fd665f987e1a7417523e4f450493b1405b8131792b1ba505a12cc5bff9c"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.777968 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" event={"ID":"0db10378-99f0-49e1-9148-5a0441652430","Type":"ContainerStarted","Data":"abfd5e310e73fb3e03fc7f4644a5bd49d1aacd26101242b554962f15a6d57204"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.780028 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" podStartSLOduration=123.780015488 podStartE2EDuration="2m3.780015488s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.626848958 +0000 UTC m=+143.086235722" watchObservedRunningTime="2025-12-01 09:10:21.780015488 +0000 UTC m=+143.239402242" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.780863 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-959fr" podStartSLOduration=123.780857893 podStartE2EDuration="2m3.780857893s" podCreationTimestamp="2025-12-01 09:08:18 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.773120916 +0000 UTC m=+143.232507670" watchObservedRunningTime="2025-12-01 09:10:21.780857893 +0000 UTC m=+143.240244647" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.780898 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" event={"ID":"311f8a58-9f7d-4fbb-b6c2-462364f8ad76","Type":"ContainerStarted","Data":"46ec8b168c7b42314146d8f17ba423530ba79ac38d54bb46f72c2bda271bd994"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.789436 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.789786 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.289755934 +0000 UTC m=+143.749142708 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.790320 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.791410 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.291391201 +0000 UTC m=+143.750778025 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.797907 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v6frd" event={"ID":"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44","Type":"ContainerStarted","Data":"a731f0a6850cf7d5ff4c0fc3acbc53b936e0f1d5e6ed83625868fc5082706257"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.797958 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v6frd" event={"ID":"6bb7434b-bdaa-44aa-bdc1-6ce96faa7e44","Type":"ContainerStarted","Data":"2024600ffd1e0a466217859e86b67f0952c151313bca4b6cbbbf9095ab743e09"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.798603 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.812163 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" event={"ID":"997ee678-8d54-4f91-af1d-4eefd5006f85","Type":"ContainerStarted","Data":"3d55b1eaf7ef500e2b1c02147c15d034ebd74aa7daabc763b269a9c0ad22259e"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.838317 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" event={"ID":"c9d5feec-4daa-4cea-a996-0a179e42de9f","Type":"ContainerStarted","Data":"a624e2af7610447a84cbcaa76d0730a9caced9ea4f83fed5af6062f5579e2b11"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 
09:10:21.845103 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" event={"ID":"a62b4517-3937-44e9-8062-982896975a9d","Type":"ContainerStarted","Data":"a52f5c79362c75855415cbc334536ecbe5c78d189fd66eed9f4a4a746220c50b"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.861123 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" event={"ID":"ff715e51-b68c-408d-87d9-7360399c9d9d","Type":"ContainerStarted","Data":"67470e61126ec42a13d13cb322e286f730d5e95c096c68d7afa426da5e07c2dc"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.861597 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" podStartSLOduration=124.861584999 podStartE2EDuration="2m4.861584999s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.860660682 +0000 UTC m=+143.320047436" watchObservedRunningTime="2025-12-01 09:10:21.861584999 +0000 UTC m=+143.320971763" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.863854 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-kbx9t" podStartSLOduration=124.863838975 podStartE2EDuration="2m4.863838975s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.818220558 +0000 UTC m=+143.277607312" watchObservedRunningTime="2025-12-01 09:10:21.863838975 +0000 UTC m=+143.323225739" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.876623 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" event={"ID":"1162eaab-3879-41de-8561-e544952c9b3c","Type":"ContainerStarted","Data":"84e90b97c8aa8fe0e437e5e27df6ba06835b9341820123860770121fe33baa21"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.878747 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" event={"ID":"c48142f7-e9a6-49ee-b638-89e198befd03","Type":"ContainerStarted","Data":"19745d4ff6b05e89e239948dfc513b95f7bca456949d56cb55931135dd4fe3d4"} Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.881883 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-zh8bh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.881952 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zh8bh" podUID="c699ddca-61b5-4f9a-ae7c-48653d9557f8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.882368 4867 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6w6sm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.882405 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" podUID="c7d3f2ef-022b-41f5-84e5-6be42f48b023" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection 
refused" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.894798 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.894934 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.394914976 +0000 UTC m=+143.854301730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.895472 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:21 crc kubenswrapper[4867]: E1201 09:10:21.898290 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 09:10:22.398257494 +0000 UTC m=+143.857644248 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.989584 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mph7x" podStartSLOduration=123.989567702 podStartE2EDuration="2m3.989567702s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.906078084 +0000 UTC m=+143.365464848" watchObservedRunningTime="2025-12-01 09:10:21.989567702 +0000 UTC m=+143.448954456" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.990648 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bnd8f" Dec 01 09:10:21 crc kubenswrapper[4867]: I1201 09:10:21.991650 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" podStartSLOduration=124.991642553 podStartE2EDuration="2m4.991642553s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:21.987651635 +0000 UTC m=+143.447038389" watchObservedRunningTime="2025-12-01 09:10:21.991642553 +0000 UTC m=+143.451029307" Dec 01 09:10:22 
crc kubenswrapper[4867]: I1201 09:10:21.998340 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.000586 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.500555714 +0000 UTC m=+143.959942488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.010355 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qq8rb" podStartSLOduration=9.0103372 podStartE2EDuration="9.0103372s" podCreationTimestamp="2025-12-01 09:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:22.008320582 +0000 UTC m=+143.467707336" watchObservedRunningTime="2025-12-01 09:10:22.0103372 +0000 UTC m=+143.469723954" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.068370 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-szthm" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.077001 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlc72" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.100666 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.100986 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.600974848 +0000 UTC m=+144.060361602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.114345 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ndvfn" podStartSLOduration=124.114323299 podStartE2EDuration="2m4.114323299s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:22.087316737 +0000 UTC m=+143.546703491" watchObservedRunningTime="2025-12-01 09:10:22.114323299 +0000 UTC m=+143.573710053" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.127473 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9n2mx" podStartSLOduration=124.127447974 podStartE2EDuration="2m4.127447974s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:22.119698756 +0000 UTC m=+143.579085550" watchObservedRunningTime="2025-12-01 09:10:22.127447974 +0000 UTC m=+143.586834968" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.198515 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c6qt4" podStartSLOduration=124.198498867 podStartE2EDuration="2m4.198498867s" podCreationTimestamp="2025-12-01 09:08:18 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:22.179070657 +0000 UTC m=+143.638457411" watchObservedRunningTime="2025-12-01 09:10:22.198498867 +0000 UTC m=+143.657885621" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.201506 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.201959 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.701943078 +0000 UTC m=+144.161329832 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.252509 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rd976" podStartSLOduration=124.2524917 podStartE2EDuration="2m4.2524917s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:22.246869665 +0000 UTC m=+143.706256419" watchObservedRunningTime="2025-12-01 09:10:22.2524917 +0000 UTC m=+143.711878454" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.252894 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5zdqv" podStartSLOduration=125.252890021 podStartE2EDuration="2m5.252890021s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:22.201771903 +0000 UTC m=+143.661158657" watchObservedRunningTime="2025-12-01 09:10:22.252890021 +0000 UTC m=+143.712276775" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.303249 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.303588 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.803574727 +0000 UTC m=+144.262961481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.305478 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4hs5" podStartSLOduration=125.305455683 podStartE2EDuration="2m5.305455683s" podCreationTimestamp="2025-12-01 09:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:22.302690732 +0000 UTC m=+143.762077486" watchObservedRunningTime="2025-12-01 09:10:22.305455683 +0000 UTC m=+143.764842437" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.330939 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-bdwkr" podStartSLOduration=124.330919599 podStartE2EDuration="2m4.330919599s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-01 09:10:22.328789136 +0000 UTC m=+143.788175900" watchObservedRunningTime="2025-12-01 09:10:22.330919599 +0000 UTC m=+143.790306353" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.407018 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.407166 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.907140403 +0000 UTC m=+144.366527157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.407277 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.407602 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:22.907594077 +0000 UTC m=+144.366980841 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.415448 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-v6frd" podStartSLOduration=9.415423426 podStartE2EDuration="9.415423426s" podCreationTimestamp="2025-12-01 09:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:22.383060417 +0000 UTC m=+143.842447181" watchObservedRunningTime="2025-12-01 09:10:22.415423426 +0000 UTC m=+143.874810180" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.496584 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:22 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:22 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:22 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.496673 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" 
podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.508365 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.508574 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.008542996 +0000 UTC m=+144.467929760 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.508624 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.509072 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.009056301 +0000 UTC m=+144.468443055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.609826 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.609996 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.109966999 +0000 UTC m=+144.569353753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.610141 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.610413 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.110399553 +0000 UTC m=+144.569786307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.653188 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kfrqj"] Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.654405 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.657186 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.662864 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfrqj"] Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.712732 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.713341 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.213310709 +0000 UTC m=+144.672697473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.713577 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.713748 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h4gh\" (UniqueName: \"kubernetes.io/projected/5a77c808-2899-42d0-95e6-72f00df5432f-kube-api-access-8h4gh\") pod \"certified-operators-kfrqj\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.713908 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-catalog-content\") pod \"certified-operators-kfrqj\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.714065 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-utilities\") pod \"certified-operators-kfrqj\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.714466 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.214451443 +0000 UTC m=+144.673838207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.815016 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.815307 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h4gh\" (UniqueName: \"kubernetes.io/projected/5a77c808-2899-42d0-95e6-72f00df5432f-kube-api-access-8h4gh\") pod \"certified-operators-kfrqj\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.815443 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.31531681 +0000 UTC m=+144.774703564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.816644 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-catalog-content\") pod \"certified-operators-kfrqj\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.815802 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-catalog-content\") pod \"certified-operators-kfrqj\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.816791 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-utilities\") pod \"certified-operators-kfrqj\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.817307 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-utilities\") pod \"certified-operators-kfrqj\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.876403 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h4gh\" (UniqueName: \"kubernetes.io/projected/5a77c808-2899-42d0-95e6-72f00df5432f-kube-api-access-8h4gh\") pod \"certified-operators-kfrqj\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.883660 4867 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6w6sm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.883710 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" podUID="c7d3f2ef-022b-41f5-84e5-6be42f48b023" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.887671 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5wrtl"] Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.888659 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.913180 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.918605 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:22 crc kubenswrapper[4867]: E1201 09:10:22.922963 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.422946605 +0000 UTC m=+144.882333449 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.970594 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:10:22 crc kubenswrapper[4867]: I1201 09:10:22.981266 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wrtl"] Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.020971 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.021162 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65b95\" (UniqueName: \"kubernetes.io/projected/4702840c-d6fa-4dcd-bd95-4ac89f95d727-kube-api-access-65b95\") pod \"community-operators-5wrtl\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.021202 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-catalog-content\") pod \"community-operators-5wrtl\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.021287 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-utilities\") pod \"community-operators-5wrtl\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.021370 4867 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.5213556 +0000 UTC m=+144.980742354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.115748 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-znqxw"] Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.116934 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.124394 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65b95\" (UniqueName: \"kubernetes.io/projected/4702840c-d6fa-4dcd-bd95-4ac89f95d727-kube-api-access-65b95\") pod \"community-operators-5wrtl\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.124433 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-catalog-content\") pod \"community-operators-5wrtl\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.124486 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.124543 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-utilities\") pod \"community-operators-5wrtl\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.124905 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-utilities\") pod \"community-operators-5wrtl\" 
(UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.125382 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-catalog-content\") pod \"community-operators-5wrtl\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.125598 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.625587657 +0000 UTC m=+145.084974411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.173656 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-znqxw"] Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.225232 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.225411 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-utilities\") pod \"certified-operators-znqxw\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.225447 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgmj\" (UniqueName: \"kubernetes.io/projected/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-kube-api-access-vkgmj\") pod \"certified-operators-znqxw\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.225683 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-catalog-content\") pod \"certified-operators-znqxw\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.225770 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.725754942 +0000 UTC m=+145.185141696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.249598 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65b95\" (UniqueName: \"kubernetes.io/projected/4702840c-d6fa-4dcd-bd95-4ac89f95d727-kube-api-access-65b95\") pod \"community-operators-5wrtl\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.327619 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgmj\" (UniqueName: \"kubernetes.io/projected/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-kube-api-access-vkgmj\") pod \"certified-operators-znqxw\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.327709 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.327834 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-catalog-content\") pod 
\"certified-operators-znqxw\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.327860 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-utilities\") pod \"certified-operators-znqxw\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.328284 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-utilities\") pod \"certified-operators-znqxw\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.328852 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.828836415 +0000 UTC m=+145.288223189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.329242 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-catalog-content\") pod \"certified-operators-znqxw\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.340860 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zr64t"] Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.342050 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.365316 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zr64t"] Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.432297 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.432586 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-utilities\") pod \"community-operators-zr64t\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.432660 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2pfj\" (UniqueName: \"kubernetes.io/projected/736579d0-4c1c-4766-abbd-4d681839bd0b-kube-api-access-r2pfj\") pod \"community-operators-zr64t\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.432716 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-catalog-content\") pod \"community-operators-zr64t\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.432883 4867 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:23.932862494 +0000 UTC m=+145.392249248 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.452499 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgmj\" (UniqueName: \"kubernetes.io/projected/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-kube-api-access-vkgmj\") pod \"certified-operators-znqxw\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.462680 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.500335 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:23 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:23 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:23 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.500393 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.509093 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.536582 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2pfj\" (UniqueName: \"kubernetes.io/projected/736579d0-4c1c-4766-abbd-4d681839bd0b-kube-api-access-r2pfj\") pod \"community-operators-zr64t\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.536639 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-catalog-content\") pod \"community-operators-zr64t\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.536695 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.536741 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-utilities\") pod \"community-operators-zr64t\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.537173 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-utilities\") pod \"community-operators-zr64t\" 
(UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.538871 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.038858212 +0000 UTC m=+145.498244966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.548117 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-catalog-content\") pod \"community-operators-zr64t\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.578750 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2pfj\" (UniqueName: \"kubernetes.io/projected/736579d0-4c1c-4766-abbd-4d681839bd0b-kube-api-access-r2pfj\") pod \"community-operators-zr64t\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.638591 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.638919 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.138904775 +0000 UTC m=+145.598291529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.705527 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.739480 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.739846 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 09:10:24.239833254 +0000 UTC m=+145.699220008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.841093 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.841404 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.341389281 +0000 UTC m=+145.800776025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.899769 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfrqj"] Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.914899 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" event={"ID":"311f8a58-9f7d-4fbb-b6c2-462364f8ad76","Type":"ContainerStarted","Data":"0d07ac6322aeacf33351771e25b816d3c68b6843df32997be984e47ee7c8ca8c"} Dec 01 09:10:23 crc kubenswrapper[4867]: I1201 09:10:23.942406 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:23 crc kubenswrapper[4867]: E1201 09:10:23.942866 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.442849695 +0000 UTC m=+145.902236449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.043322 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.044122 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.544101564 +0000 UTC m=+146.003488318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.145012 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.145485 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.645473436 +0000 UTC m=+146.104860190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.204964 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-znqxw"] Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.246613 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.246742 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.746724405 +0000 UTC m=+146.206111159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.247019 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.247400 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.747390194 +0000 UTC m=+146.206776948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.344654 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wrtl"] Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.348507 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.348926 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.84890333 +0000 UTC m=+146.308290084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.411203 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.411470 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.436355 4867 patch_prober.go:28] interesting pod/apiserver-76f77b778f-4dj8h container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]log ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]etcd ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]poststarthook/generic-apiserver-start-informers ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]poststarthook/max-in-flight-filter ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 01 09:10:24 crc kubenswrapper[4867]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 01 09:10:24 crc kubenswrapper[4867]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 01 09:10:24 crc kubenswrapper[4867]: 
[+]poststarthook/project.openshift.io-projectcache ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]poststarthook/openshift.io-startinformers ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 01 09:10:24 crc kubenswrapper[4867]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 01 09:10:24 crc kubenswrapper[4867]: livez check failed Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.436402 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" podUID="8b075132-2629-49ad-9361-42fe48ae5b57" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.451721 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.454106 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:24.954084484 +0000 UTC m=+146.413471338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.494939 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:24 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:24 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:24 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.495242 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.562447 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.562881 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 09:10:25.062862743 +0000 UTC m=+146.522249497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.663449 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.664014 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.164001488 +0000 UTC m=+146.623388242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.678585 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zr64t"] Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.712560 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.713186 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.731375 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.765322 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.766444 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 09:10:25.26642119 +0000 UTC m=+146.725808004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.852683 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nv8tc"] Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.853828 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.858098 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.867233 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.867604 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.367592596 +0000 UTC m=+146.826979350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.867958 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nv8tc"] Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.922020 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr64t" event={"ID":"736579d0-4c1c-4766-abbd-4d681839bd0b","Type":"ContainerStarted","Data":"400e469a019b1504dcc80fcd8f3c0c491501374dbb5d62a64812181fbbf3bfa5"} Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.929129 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" event={"ID":"311f8a58-9f7d-4fbb-b6c2-462364f8ad76","Type":"ContainerStarted","Data":"d11fde16b2bb6fbdd17170df53d8ccd7eaf8a4078126fc6909149f3313341633"} Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.932240 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wrtl" event={"ID":"4702840c-d6fa-4dcd-bd95-4ac89f95d727","Type":"ContainerStarted","Data":"9f6a4d8538d9c6fff75c6a7f89a7bc8e7f85f9efa2fd7e4656a27452b29b0a6b"} Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.934613 4867 generic.go:334] "Generic (PLEG): container finished" podID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerID="1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488" exitCode=0 Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.934778 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-znqxw" event={"ID":"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88","Type":"ContainerDied","Data":"1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488"} Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.934872 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znqxw" event={"ID":"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88","Type":"ContainerStarted","Data":"84f3a573996ef0f968e1a72c428fb33ef4b9a59ad4e8efa3205e1b189e419344"} Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.936887 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.938539 4867 generic.go:334] "Generic (PLEG): container finished" podID="5a77c808-2899-42d0-95e6-72f00df5432f" containerID="a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403" exitCode=0 Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.938679 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfrqj" event={"ID":"5a77c808-2899-42d0-95e6-72f00df5432f","Type":"ContainerDied","Data":"a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403"} Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.939232 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfrqj" event={"ID":"5a77c808-2899-42d0-95e6-72f00df5432f","Type":"ContainerStarted","Data":"2feef0f2d0d02f1d63d461d746b5bf27de96e8b42cdbad16b0aa35e9ed4f9693"} Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.956726 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s6z2w" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.968346 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.968570 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-catalog-content\") pod \"redhat-marketplace-nv8tc\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.968660 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfsbs\" (UniqueName: \"kubernetes.io/projected/545d34e2-c5a8-48ab-9603-9ae4986ab739-kube-api-access-rfsbs\") pod \"redhat-marketplace-nv8tc\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:24 crc kubenswrapper[4867]: I1201 09:10:24.968731 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-utilities\") pod \"redhat-marketplace-nv8tc\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:24 crc kubenswrapper[4867]: E1201 09:10:24.968878 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.468853445 +0000 UTC m=+146.928240199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.069530 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-catalog-content\") pod \"redhat-marketplace-nv8tc\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.069638 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfsbs\" (UniqueName: \"kubernetes.io/projected/545d34e2-c5a8-48ab-9603-9ae4986ab739-kube-api-access-rfsbs\") pod \"redhat-marketplace-nv8tc\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.069688 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.069735 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-utilities\") pod \"redhat-marketplace-nv8tc\" (UID: 
\"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.070106 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-catalog-content\") pod \"redhat-marketplace-nv8tc\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:25 crc kubenswrapper[4867]: E1201 09:10:25.070446 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.570434683 +0000 UTC m=+147.029821437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.070898 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-utilities\") pod \"redhat-marketplace-nv8tc\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.084914 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kc82j" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.140792 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rfsbs\" (UniqueName: \"kubernetes.io/projected/545d34e2-c5a8-48ab-9603-9ae4986ab739-kube-api-access-rfsbs\") pod \"redhat-marketplace-nv8tc\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.174240 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:25 crc kubenswrapper[4867]: E1201 09:10:25.174761 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.67472905 +0000 UTC m=+147.134115804 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.229254 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.276015 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jsrqd"] Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.277356 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.298075 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:25 crc kubenswrapper[4867]: E1201 09:10:25.298421 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.798407036 +0000 UTC m=+147.257793790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.315970 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsrqd"] Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.367222 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.367866 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.368854 4867 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.371420 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.371597 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.399402 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 
09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.399752 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-utilities\") pod \"redhat-marketplace-jsrqd\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.399788 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-catalog-content\") pod \"redhat-marketplace-jsrqd\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.399845 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njnkr\" (UniqueName: \"kubernetes.io/projected/e2e7d46b-70cc-4b22-a041-3015e8b38f37-kube-api-access-njnkr\") pod \"redhat-marketplace-jsrqd\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: E1201 09:10:25.399981 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:25.899966283 +0000 UTC m=+147.359353037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.444030 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.504603 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:25 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:25 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:25 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.504655 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.505446 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.505503 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-utilities\") pod \"redhat-marketplace-jsrqd\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.505524 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-catalog-content\") pod \"redhat-marketplace-jsrqd\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.505553 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.505613 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njnkr\" (UniqueName: \"kubernetes.io/projected/e2e7d46b-70cc-4b22-a041-3015e8b38f37-kube-api-access-njnkr\") pod \"redhat-marketplace-jsrqd\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.505736 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:25 crc kubenswrapper[4867]: E1201 09:10:25.506087 4867 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:26.006074624 +0000 UTC m=+147.465461378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.506510 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-utilities\") pod \"redhat-marketplace-jsrqd\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.506772 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-catalog-content\") pod \"redhat-marketplace-jsrqd\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.556296 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njnkr\" (UniqueName: \"kubernetes.io/projected/e2e7d46b-70cc-4b22-a041-3015e8b38f37-kube-api-access-njnkr\") pod \"redhat-marketplace-jsrqd\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.609979 4867 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.610149 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.610174 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.610207 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.610232 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:10:25 crc 
kubenswrapper[4867]: I1201 09:10:25.610253 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:25 crc kubenswrapper[4867]: E1201 09:10:25.613719 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:26.11369806 +0000 UTC m=+147.573084814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.613758 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.616000 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.616997 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.619560 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.647712 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.655417 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.686900 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.712673 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.712799 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:25 crc kubenswrapper[4867]: E1201 09:10:25.713124 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:26.213112484 +0000 UTC m=+147.672499238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.732599 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.745273 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.782044 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.782596 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.783465 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nv8tc"] Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.813415 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:25 crc kubenswrapper[4867]: E1201 09:10:25.813734 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:26.313708603 +0000 UTC m=+147.773095357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.839749 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.883699 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pwxb4"] Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.884867 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.890671 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.903427 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwxb4"] Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.915750 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-catalog-content\") pod \"redhat-operators-pwxb4\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.915883 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-utilities\") pod \"redhat-operators-pwxb4\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.915932 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsjmq\" (UniqueName: \"kubernetes.io/projected/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-kube-api-access-fsjmq\") pod \"redhat-operators-pwxb4\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.915978 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:25 crc kubenswrapper[4867]: E1201 09:10:25.918412 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:26.418397922 +0000 UTC m=+147.877784676 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.937996 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-zh8bh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.938051 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zh8bh" podUID="c699ddca-61b5-4f9a-ae7c-48653d9557f8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.939166 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-zh8bh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 
09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.939203 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zh8bh" podUID="c699ddca-61b5-4f9a-ae7c-48653d9557f8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.984067 4867 generic.go:334] "Generic (PLEG): container finished" podID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerID="b15eb9582b0724ef1efbcbd45215495b0e0b19950f2e71bb810a73b1bc10ed16" exitCode=0 Dec 01 09:10:25 crc kubenswrapper[4867]: I1201 09:10:25.984139 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr64t" event={"ID":"736579d0-4c1c-4766-abbd-4d681839bd0b","Type":"ContainerDied","Data":"b15eb9582b0724ef1efbcbd45215495b0e0b19950f2e71bb810a73b1bc10ed16"} Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.019536 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:26 crc kubenswrapper[4867]: E1201 09:10:26.020124 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:10:26.520101404 +0000 UTC m=+147.979488158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.021546 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-catalog-content\") pod \"redhat-operators-pwxb4\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.021612 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-utilities\") pod \"redhat-operators-pwxb4\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.021637 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsjmq\" (UniqueName: \"kubernetes.io/projected/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-kube-api-access-fsjmq\") pod \"redhat-operators-pwxb4\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.022771 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-catalog-content\") pod \"redhat-operators-pwxb4\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " 
pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.023032 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-utilities\") pod \"redhat-operators-pwxb4\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.023457 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" event={"ID":"311f8a58-9f7d-4fbb-b6c2-462364f8ad76","Type":"ContainerStarted","Data":"4340e8eed05c6d411b8d592095c25a6364d7422d28318cb285355e5a93a24227"} Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.032068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nv8tc" event={"ID":"545d34e2-c5a8-48ab-9603-9ae4986ab739","Type":"ContainerStarted","Data":"67c4a8b5a285b99aa43f1325bc3ee24138ef713e7629c0b8b46aa67d2fc460b3"} Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.049825 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsjmq\" (UniqueName: \"kubernetes.io/projected/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-kube-api-access-fsjmq\") pod \"redhat-operators-pwxb4\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.050248 4867 generic.go:334] "Generic (PLEG): container finished" podID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerID="e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20" exitCode=0 Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.050957 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wrtl" 
event={"ID":"4702840c-d6fa-4dcd-bd95-4ac89f95d727","Type":"ContainerDied","Data":"e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20"} Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.059635 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4rxmf" podStartSLOduration=13.059616813 podStartE2EDuration="13.059616813s" podCreationTimestamp="2025-12-01 09:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:26.056433579 +0000 UTC m=+147.515820333" watchObservedRunningTime="2025-12-01 09:10:26.059616813 +0000 UTC m=+147.519003567" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.085905 4867 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T09:10:25.368869142Z","Handler":null,"Name":""} Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.124273 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:26 crc kubenswrapper[4867]: E1201 09:10:26.125378 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:10:26.62536312 +0000 UTC m=+148.084749874 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9k7d" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.129267 4867 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.129298 4867 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.194831 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.194868 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.196788 4867 patch_prober.go:28] interesting pod/console-f9d7485db-kdm2m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.196845 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kdm2m" podUID="ec06a7ff-9325-4de9-b47e-d8315761bf8d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" 
Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.227487 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.257473 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.277372 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zcq6l"] Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.305390 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.306128 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.343104 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.347843 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zcq6l"] Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.437737 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-utilities\") pod \"redhat-operators-zcq6l\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.437778 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghmkb\" (UniqueName: \"kubernetes.io/projected/c347bb68-7140-4b8d-ae43-ee55b581c961-kube-api-access-ghmkb\") pod \"redhat-operators-zcq6l\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.437832 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.437897 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-catalog-content\") pod \"redhat-operators-zcq6l\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.451298 4867 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.451331 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.488467 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.533261 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:26 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:26 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:26 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.533313 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.539477 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-catalog-content\") pod \"redhat-operators-zcq6l\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.539616 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-utilities\") pod \"redhat-operators-zcq6l\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.539638 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghmkb\" (UniqueName: \"kubernetes.io/projected/c347bb68-7140-4b8d-ae43-ee55b581c961-kube-api-access-ghmkb\") pod \"redhat-operators-zcq6l\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.540902 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-catalog-content\") pod \"redhat-operators-zcq6l\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.541414 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-utilities\") pod \"redhat-operators-zcq6l\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.572105 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghmkb\" (UniqueName: 
\"kubernetes.io/projected/c347bb68-7140-4b8d-ae43-ee55b581c961-kube-api-access-ghmkb\") pod \"redhat-operators-zcq6l\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.668226 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.686214 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9k7d\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.806973 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsrqd"] Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.874539 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 09:10:26 crc kubenswrapper[4867]: I1201 09:10:26.936322 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.083038 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.086922 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b125e30200ab9a0b1e9bca7084d13e1838a4fb9c4289b787dfe251294e7404bd"} Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.086972 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8cbb7e0fa0e7ef9a76c07898a175b2923f9b8302d5c06feb6a01444878f0d672"} Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.107134 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b27866633ce7e5e88ae3cc1e21cdf02780ffe7267d3d692fd38e196f6efcceb5"} Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.109605 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ac5cdda1b0909217173a53844f9ef3886f76fe52110f686c517c55ec280998d4"} Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.135285 4867 generic.go:334] "Generic (PLEG): container finished" podID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerID="891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99" exitCode=0 Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.135410 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-nv8tc" event={"ID":"545d34e2-c5a8-48ab-9603-9ae4986ab739","Type":"ContainerDied","Data":"891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99"} Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.152868 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsrqd" event={"ID":"e2e7d46b-70cc-4b22-a041-3015e8b38f37","Type":"ContainerStarted","Data":"42eabb49921acd18b972ed5e8ef3d1e5787f0a5da04f825b4ff79f09e8650e01"} Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.250952 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwxb4"] Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.315421 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zcq6l"] Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.491513 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:27 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:27 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:27 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.491793 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:27 crc kubenswrapper[4867]: I1201 09:10:27.658353 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9k7d"] Dec 01 09:10:27 crc kubenswrapper[4867]: W1201 09:10:27.697470 4867 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd00d9bfd_cd31_44f5_8b56_d14af3823d29.slice/crio-3bed272814f2d34bd9f8a900983de2d953eabbd2a6e32418cda3103b6e2f789d WatchSource:0}: Error finding container 3bed272814f2d34bd9f8a900983de2d953eabbd2a6e32418cda3103b6e2f789d: Status 404 returned error can't find the container with id 3bed272814f2d34bd9f8a900983de2d953eabbd2a6e32418cda3103b6e2f789d Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.193969 4867 generic.go:334] "Generic (PLEG): container finished" podID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerID="0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1" exitCode=0 Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.194061 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsrqd" event={"ID":"e2e7d46b-70cc-4b22-a041-3015e8b38f37","Type":"ContainerDied","Data":"0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1"} Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.201834 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"16c05bfb-f0c5-41e1-9a46-415a141f5b53","Type":"ContainerStarted","Data":"4746ba6346def1e30ccea6449292f896f90d709b857bf8986b82462e6f1f5c45"} Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.201881 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"16c05bfb-f0c5-41e1-9a46-415a141f5b53","Type":"ContainerStarted","Data":"ed8d12c39dea4e964b3a64b1bfa8f0e5e14839f8f5ad627289857213e4f23dac"} Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.239948 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"90dc7b8c95ff9b8dcced5a3c6b47a4f819755c31cb21ee7ceb6288f61146f1b6"} Dec 01 
09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.248773 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.248759711 podStartE2EDuration="3.248759711s" podCreationTimestamp="2025-12-01 09:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:28.24836977 +0000 UTC m=+149.707756524" watchObservedRunningTime="2025-12-01 09:10:28.248759711 +0000 UTC m=+149.708146465" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.251473 4867 generic.go:334] "Generic (PLEG): container finished" podID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerID="55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1" exitCode=0 Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.251726 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcq6l" event={"ID":"c347bb68-7140-4b8d-ae43-ee55b581c961","Type":"ContainerDied","Data":"55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1"} Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.251780 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcq6l" event={"ID":"c347bb68-7140-4b8d-ae43-ee55b581c961","Type":"ContainerStarted","Data":"fa89d4da9867e4f58df7a5818221a7b3afd90ad080d230d250fd8528e489b0e2"} Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.263527 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ba1421d84c7e06b414b62bc6156eff9deaeb8235423e0e44e73bd5fd3fca6635"} Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.264260 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 
09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.267710 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.268323 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.274684 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.274913 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.280582 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" event={"ID":"d00d9bfd-cd31-44f5-8b56-d14af3823d29","Type":"ContainerStarted","Data":"3bed272814f2d34bd9f8a900983de2d953eabbd2a6e32418cda3103b6e2f789d"} Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.281227 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.312872 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.313207 4867 generic.go:334] "Generic (PLEG): container finished" podID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerID="4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01" exitCode=0 Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.313249 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwxb4" 
event={"ID":"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f","Type":"ContainerDied","Data":"4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01"} Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.313274 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwxb4" event={"ID":"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f","Type":"ContainerStarted","Data":"140d0e133ff553f42773c657ec6f6a5037a4e5efb5f3f4ced27e47afea6a621f"} Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.413096 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.413261 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.432371 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" podStartSLOduration=130.432354674 podStartE2EDuration="2m10.432354674s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:10:28.430078307 +0000 UTC m=+149.889465061" watchObservedRunningTime="2025-12-01 09:10:28.432354674 +0000 UTC m=+149.891741428" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.495222 4867 patch_prober.go:28] interesting 
pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:28 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:28 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:28 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.495296 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.514805 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.514923 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.514996 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.556427 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:28 crc kubenswrapper[4867]: I1201 09:10:28.595903 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:29 crc kubenswrapper[4867]: I1201 09:10:29.057413 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 09:10:29 crc kubenswrapper[4867]: I1201 09:10:29.346856 4867 generic.go:334] "Generic (PLEG): container finished" podID="16c05bfb-f0c5-41e1-9a46-415a141f5b53" containerID="4746ba6346def1e30ccea6449292f896f90d709b857bf8986b82462e6f1f5c45" exitCode=0 Dec 01 09:10:29 crc kubenswrapper[4867]: I1201 09:10:29.347158 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"16c05bfb-f0c5-41e1-9a46-415a141f5b53","Type":"ContainerDied","Data":"4746ba6346def1e30ccea6449292f896f90d709b857bf8986b82462e6f1f5c45"} Dec 01 09:10:29 crc kubenswrapper[4867]: I1201 09:10:29.377742 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91","Type":"ContainerStarted","Data":"fce0959ca736801e3f2b2926f92f2bd7e4ea3b4d37484b32d567af6228c763fe"} Dec 01 09:10:29 crc kubenswrapper[4867]: I1201 09:10:29.413781 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:29 crc kubenswrapper[4867]: I1201 09:10:29.413846 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" 
event={"ID":"d00d9bfd-cd31-44f5-8b56-d14af3823d29","Type":"ContainerStarted","Data":"7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298"} Dec 01 09:10:29 crc kubenswrapper[4867]: I1201 09:10:29.421401 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-4dj8h" Dec 01 09:10:29 crc kubenswrapper[4867]: I1201 09:10:29.490306 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:29 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:29 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:29 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:29 crc kubenswrapper[4867]: I1201 09:10:29.490383 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:30 crc kubenswrapper[4867]: I1201 09:10:30.424564 4867 generic.go:334] "Generic (PLEG): container finished" podID="45e5dcb3-d55f-40cf-a89f-3367e84322d1" containerID="8f899fc084ad650caac3f1299eb696d18ff910f8c08ff8465a177917e24b2f4e" exitCode=0 Dec 01 09:10:30 crc kubenswrapper[4867]: I1201 09:10:30.424643 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" event={"ID":"45e5dcb3-d55f-40cf-a89f-3367e84322d1","Type":"ContainerDied","Data":"8f899fc084ad650caac3f1299eb696d18ff910f8c08ff8465a177917e24b2f4e"} Dec 01 09:10:30 crc kubenswrapper[4867]: I1201 09:10:30.495345 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:30 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:30 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:30 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:30 crc kubenswrapper[4867]: I1201 09:10:30.495450 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:30 crc kubenswrapper[4867]: I1201 09:10:30.831556 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:30 crc kubenswrapper[4867]: I1201 09:10:30.976306 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kube-api-access\") pod \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\" (UID: \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\") " Dec 01 09:10:30 crc kubenswrapper[4867]: I1201 09:10:30.976456 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kubelet-dir\") pod \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\" (UID: \"16c05bfb-f0c5-41e1-9a46-415a141f5b53\") " Dec 01 09:10:30 crc kubenswrapper[4867]: I1201 09:10:30.976591 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "16c05bfb-f0c5-41e1-9a46-415a141f5b53" (UID: "16c05bfb-f0c5-41e1-9a46-415a141f5b53"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:30.987057 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "16c05bfb-f0c5-41e1-9a46-415a141f5b53" (UID: "16c05bfb-f0c5-41e1-9a46-415a141f5b53"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:31.078230 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:31.078269 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16c05bfb-f0c5-41e1-9a46-415a141f5b53-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:31.500557 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:31 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:31 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:31 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:31.500613 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:31.551433 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"16c05bfb-f0c5-41e1-9a46-415a141f5b53","Type":"ContainerDied","Data":"ed8d12c39dea4e964b3a64b1bfa8f0e5e14839f8f5ad627289857213e4f23dac"} Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:31.551492 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed8d12c39dea4e964b3a64b1bfa8f0e5e14839f8f5ad627289857213e4f23dac" Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:31.551597 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:31.556647 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91","Type":"ContainerStarted","Data":"cf42519ae21c95983d4dd6bcb49c643398661db5dfb48f80d4e4167207cc0fb3"} Dec 01 09:10:31 crc kubenswrapper[4867]: I1201 09:10:31.607509 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v6frd" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.029256 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.204087 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45e5dcb3-d55f-40cf-a89f-3367e84322d1-secret-volume\") pod \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.204196 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9mnf\" (UniqueName: \"kubernetes.io/projected/45e5dcb3-d55f-40cf-a89f-3367e84322d1-kube-api-access-k9mnf\") pod \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.204235 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e5dcb3-d55f-40cf-a89f-3367e84322d1-config-volume\") pod \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\" (UID: \"45e5dcb3-d55f-40cf-a89f-3367e84322d1\") " Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.205521 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e5dcb3-d55f-40cf-a89f-3367e84322d1-config-volume" (OuterVolumeSpecName: "config-volume") pod "45e5dcb3-d55f-40cf-a89f-3367e84322d1" (UID: "45e5dcb3-d55f-40cf-a89f-3367e84322d1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.210100 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e5dcb3-d55f-40cf-a89f-3367e84322d1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "45e5dcb3-d55f-40cf-a89f-3367e84322d1" (UID: "45e5dcb3-d55f-40cf-a89f-3367e84322d1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.221283 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e5dcb3-d55f-40cf-a89f-3367e84322d1-kube-api-access-k9mnf" (OuterVolumeSpecName: "kube-api-access-k9mnf") pod "45e5dcb3-d55f-40cf-a89f-3367e84322d1" (UID: "45e5dcb3-d55f-40cf-a89f-3367e84322d1"). InnerVolumeSpecName "kube-api-access-k9mnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.306569 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45e5dcb3-d55f-40cf-a89f-3367e84322d1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.306617 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9mnf\" (UniqueName: \"kubernetes.io/projected/45e5dcb3-d55f-40cf-a89f-3367e84322d1-kube-api-access-k9mnf\") on node \"crc\" DevicePath \"\"" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.306633 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e5dcb3-d55f-40cf-a89f-3367e84322d1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.495234 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:32 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:32 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:32 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.495306 4867 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.584455 4867 generic.go:334] "Generic (PLEG): container finished" podID="4a09f8b1-f6c9-41b2-a36b-0cf326f38b91" containerID="cf42519ae21c95983d4dd6bcb49c643398661db5dfb48f80d4e4167207cc0fb3" exitCode=0 Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.584511 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91","Type":"ContainerDied","Data":"cf42519ae21c95983d4dd6bcb49c643398661db5dfb48f80d4e4167207cc0fb3"} Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.596853 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" event={"ID":"45e5dcb3-d55f-40cf-a89f-3367e84322d1","Type":"ContainerDied","Data":"353d5f2169c28ef07dca964e4824a37cd934bf5c994c2aa49ab2cc35e516c7f5"} Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.596892 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="353d5f2169c28ef07dca964e4824a37cd934bf5c994c2aa49ab2cc35e516c7f5" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.596895 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h" Dec 01 09:10:32 crc kubenswrapper[4867]: I1201 09:10:32.925464 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.020462 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kubelet-dir\") pod \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\" (UID: \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\") " Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.020557 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kube-api-access\") pod \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\" (UID: \"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91\") " Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.020606 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4a09f8b1-f6c9-41b2-a36b-0cf326f38b91" (UID: "4a09f8b1-f6c9-41b2-a36b-0cf326f38b91"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.020892 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.025474 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4a09f8b1-f6c9-41b2-a36b-0cf326f38b91" (UID: "4a09f8b1-f6c9-41b2-a36b-0cf326f38b91"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.122285 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a09f8b1-f6c9-41b2-a36b-0cf326f38b91-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.490378 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:33 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:33 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:33 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.490679 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.625273 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.625317 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4a09f8b1-f6c9-41b2-a36b-0cf326f38b91","Type":"ContainerDied","Data":"fce0959ca736801e3f2b2926f92f2bd7e4ea3b4d37484b32d567af6228c763fe"} Dec 01 09:10:33 crc kubenswrapper[4867]: I1201 09:10:33.625367 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fce0959ca736801e3f2b2926f92f2bd7e4ea3b4d37484b32d567af6228c763fe" Dec 01 09:10:34 crc kubenswrapper[4867]: I1201 09:10:34.489577 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:34 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:34 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:34 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:34 crc kubenswrapper[4867]: I1201 09:10:34.489651 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:35 crc kubenswrapper[4867]: I1201 09:10:35.489506 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:35 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:35 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:35 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:35 
crc kubenswrapper[4867]: I1201 09:10:35.489550 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:35 crc kubenswrapper[4867]: I1201 09:10:35.940632 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zh8bh" Dec 01 09:10:36 crc kubenswrapper[4867]: I1201 09:10:36.195160 4867 patch_prober.go:28] interesting pod/console-f9d7485db-kdm2m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Dec 01 09:10:36 crc kubenswrapper[4867]: I1201 09:10:36.195427 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kdm2m" podUID="ec06a7ff-9325-4de9-b47e-d8315761bf8d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Dec 01 09:10:36 crc kubenswrapper[4867]: I1201 09:10:36.489890 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:36 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:36 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:36 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:36 crc kubenswrapper[4867]: I1201 09:10:36.489943 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 01 09:10:37 crc kubenswrapper[4867]: I1201 09:10:37.490183 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:37 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:37 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:37 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:37 crc kubenswrapper[4867]: I1201 09:10:37.490240 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:38 crc kubenswrapper[4867]: I1201 09:10:38.489133 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:38 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:38 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:38 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:38 crc kubenswrapper[4867]: I1201 09:10:38.489194 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:39 crc kubenswrapper[4867]: I1201 09:10:39.489742 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:39 crc 
kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:39 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:39 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:39 crc kubenswrapper[4867]: I1201 09:10:39.490072 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:39 crc kubenswrapper[4867]: I1201 09:10:39.760592 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:10:39 crc kubenswrapper[4867]: I1201 09:10:39.765646 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3ff1be1-b98b-483b-83ca-eb2255f66c7c-metrics-certs\") pod \"network-metrics-daemon-n7wvd\" (UID: \"c3ff1be1-b98b-483b-83ca-eb2255f66c7c\") " pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:10:39 crc kubenswrapper[4867]: I1201 09:10:39.848767 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wvd" Dec 01 09:10:40 crc kubenswrapper[4867]: I1201 09:10:40.489617 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:40 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:40 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:40 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:40 crc kubenswrapper[4867]: I1201 09:10:40.489915 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:41 crc kubenswrapper[4867]: I1201 09:10:41.489758 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:41 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:41 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:41 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:41 crc kubenswrapper[4867]: I1201 09:10:41.489849 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:42 crc kubenswrapper[4867]: I1201 09:10:42.489451 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 01 09:10:42 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Dec 01 09:10:42 crc kubenswrapper[4867]: [+]process-running ok Dec 01 09:10:42 crc kubenswrapper[4867]: healthz check failed Dec 01 09:10:42 crc kubenswrapper[4867]: I1201 09:10:42.489511 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:10:43 crc kubenswrapper[4867]: I1201 09:10:43.490280 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:43 crc kubenswrapper[4867]: I1201 09:10:43.493538 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-l44jd" Dec 01 09:10:46 crc kubenswrapper[4867]: I1201 09:10:46.195225 4867 patch_prober.go:28] interesting pod/console-f9d7485db-kdm2m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Dec 01 09:10:46 crc kubenswrapper[4867]: I1201 09:10:46.195605 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kdm2m" podUID="ec06a7ff-9325-4de9-b47e-d8315761bf8d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Dec 01 09:10:46 crc kubenswrapper[4867]: I1201 09:10:46.946038 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:10:51 crc kubenswrapper[4867]: I1201 09:10:51.601388 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:10:51 crc kubenswrapper[4867]: I1201 09:10:51.601725 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:10:55 crc kubenswrapper[4867]: I1201 09:10:55.868612 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc54g" Dec 01 09:10:56 crc kubenswrapper[4867]: I1201 09:10:56.198751 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:56 crc kubenswrapper[4867]: I1201 09:10:56.202608 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:10:57 crc kubenswrapper[4867]: I1201 09:10:57.529476 4867 patch_prober.go:28] interesting pod/router-default-5444994796-l44jd container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 09:10:57 crc kubenswrapper[4867]: I1201 09:10:57.530911 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-l44jd" podUID="5f2052f6-d2cd-4aba-b254-bea0cb7b6aba" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:11:03 crc kubenswrapper[4867]: 
E1201 09:11:03.002607 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 09:11:03 crc kubenswrapper[4867]: E1201 09:11:03.003299 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njnkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-jsrqd_openshift-marketplace(e2e7d46b-70cc-4b22-a041-3015e8b38f37): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:11:03 crc kubenswrapper[4867]: E1201 09:11:03.004465 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jsrqd" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" Dec 01 09:11:03 crc kubenswrapper[4867]: E1201 09:11:03.026683 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 09:11:03 crc kubenswrapper[4867]: E1201 09:11:03.027212 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rfsbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nv8tc_openshift-marketplace(545d34e2-c5a8-48ab-9603-9ae4986ab739): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:11:03 crc kubenswrapper[4867]: E1201 09:11:03.029449 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nv8tc" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" Dec 01 09:11:04 crc 
kubenswrapper[4867]: I1201 09:11:04.151688 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.171253 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c05bfb-f0c5-41e1-9a46-415a141f5b53" containerName="pruner" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.171298 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c05bfb-f0c5-41e1-9a46-415a141f5b53" containerName="pruner" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.171920 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a09f8b1-f6c9-41b2-a36b-0cf326f38b91" containerName="pruner" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.171955 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a09f8b1-f6c9-41b2-a36b-0cf326f38b91" containerName="pruner" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.171982 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e5dcb3-d55f-40cf-a89f-3367e84322d1" containerName="collect-profiles" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.171992 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e5dcb3-d55f-40cf-a89f-3367e84322d1" containerName="collect-profiles" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.172165 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a09f8b1-f6c9-41b2-a36b-0cf326f38b91" containerName="pruner" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.172192 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c05bfb-f0c5-41e1-9a46-415a141f5b53" containerName="pruner" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.172207 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e5dcb3-d55f-40cf-a89f-3367e84322d1" containerName="collect-profiles" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.172604 4867 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.172780 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.176498 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.177161 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.270074 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.270160 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.371861 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.371935 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.372055 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.393612 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.478322 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jsrqd" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.478600 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nv8tc" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" Dec 01 09:11:04 crc kubenswrapper[4867]: I1201 09:11:04.502087 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.545108 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.545268 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-65b95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5wrtl_openshift-marketplace(4702840c-d6fa-4dcd-bd95-4ac89f95d727): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.546475 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5wrtl" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.564924 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.565120 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2pfj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zr64t_openshift-marketplace(736579d0-4c1c-4766-abbd-4d681839bd0b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:11:04 crc kubenswrapper[4867]: E1201 09:11:04.566801 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zr64t" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" Dec 01 09:11:05 crc 
kubenswrapper[4867]: I1201 09:11:05.788239 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:11:06 crc kubenswrapper[4867]: E1201 09:11:06.276692 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5wrtl" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" Dec 01 09:11:06 crc kubenswrapper[4867]: E1201 09:11:06.276969 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zr64t" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" Dec 01 09:11:06 crc kubenswrapper[4867]: E1201 09:11:06.343522 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 09:11:06 crc kubenswrapper[4867]: E1201 09:11:06.343678 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8h4gh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kfrqj_openshift-marketplace(5a77c808-2899-42d0-95e6-72f00df5432f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:11:06 crc kubenswrapper[4867]: E1201 09:11:06.344829 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kfrqj" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" Dec 01 09:11:07 crc 
kubenswrapper[4867]: I1201 09:11:07.400211 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p989q"] Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.544502 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.545151 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.545218 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.638857 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.638945 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c55c2a62-535e-4781-a610-4eeea00a871c-kube-api-access\") pod \"installer-9-crc\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.639078 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-var-lock\") pod \"installer-9-crc\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.740637 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.740692 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c55c2a62-535e-4781-a610-4eeea00a871c-kube-api-access\") pod \"installer-9-crc\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.740717 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-var-lock\") pod \"installer-9-crc\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.740858 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.740922 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-var-lock\") pod \"installer-9-crc\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.760789 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c55c2a62-535e-4781-a610-4eeea00a871c-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"c55c2a62-535e-4781-a610-4eeea00a871c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:08 crc kubenswrapper[4867]: I1201 09:11:08.896746 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:13 crc kubenswrapper[4867]: E1201 09:11:13.596710 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kfrqj" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.140688 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.141260 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vkgmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-znqxw_openshift-marketplace(d6f7e7b3-ad0a-41ab-8291-c5200fe31a88): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.144578 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-znqxw" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" Dec 01 09:11:14 crc 
kubenswrapper[4867]: E1201 09:11:14.144694 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.144943 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fsjmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-pwxb4_openshift-marketplace(8d9f8ccf-fbe9-42b0-84e5-b1913365a11f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.146187 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-pwxb4" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" Dec 01 09:11:14 crc kubenswrapper[4867]: I1201 09:11:14.198450 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n7wvd"] Dec 01 09:11:14 crc kubenswrapper[4867]: W1201 09:11:14.205203 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3ff1be1_b98b_483b_83ca_eb2255f66c7c.slice/crio-aca50f9c16690a372b0e0533e2324a76ac64ea985957c14c4c0400104f059ddc WatchSource:0}: Error finding container aca50f9c16690a372b0e0533e2324a76ac64ea985957c14c4c0400104f059ddc: Status 404 returned error can't find the container with id aca50f9c16690a372b0e0533e2324a76ac64ea985957c14c4c0400104f059ddc Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.222568 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.222744 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghmkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zcq6l_openshift-marketplace(c347bb68-7140-4b8d-ae43-ee55b581c961): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.224845 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zcq6l" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" Dec 01 09:11:14 crc 
kubenswrapper[4867]: I1201 09:11:14.263189 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 09:11:14 crc kubenswrapper[4867]: W1201 09:11:14.271925 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod38b87dcb_a910_4b50_b784_e5c8fa7c4669.slice/crio-1f2090fbbb7c38763aaa784dc575f8f5c80bc20f773230a214f3b9fd7fb6b05a WatchSource:0}: Error finding container 1f2090fbbb7c38763aaa784dc575f8f5c80bc20f773230a214f3b9fd7fb6b05a: Status 404 returned error can't find the container with id 1f2090fbbb7c38763aaa784dc575f8f5c80bc20f773230a214f3b9fd7fb6b05a Dec 01 09:11:14 crc kubenswrapper[4867]: I1201 09:11:14.303428 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 09:11:14 crc kubenswrapper[4867]: I1201 09:11:14.921899 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c55c2a62-535e-4781-a610-4eeea00a871c","Type":"ContainerStarted","Data":"d42414f89d26f144c34efca909ffc2678ccb4f962e3f2bfb75a834897c50fb21"} Dec 01 09:11:14 crc kubenswrapper[4867]: I1201 09:11:14.922189 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c55c2a62-535e-4781-a610-4eeea00a871c","Type":"ContainerStarted","Data":"a076593fd179714b02857c8dc750f5b308aff8f49017d913c5ed7c1ba73141fc"} Dec 01 09:11:14 crc kubenswrapper[4867]: I1201 09:11:14.931379 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"38b87dcb-a910-4b50-b784-e5c8fa7c4669","Type":"ContainerStarted","Data":"069c443b3a1c6cf8ee31865835b3ae8225e57f54cc81884eadc1c56b5b8bac9f"} Dec 01 09:11:14 crc kubenswrapper[4867]: I1201 09:11:14.931417 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"38b87dcb-a910-4b50-b784-e5c8fa7c4669","Type":"ContainerStarted","Data":"1f2090fbbb7c38763aaa784dc575f8f5c80bc20f773230a214f3b9fd7fb6b05a"} Dec 01 09:11:14 crc kubenswrapper[4867]: I1201 09:11:14.934614 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" event={"ID":"c3ff1be1-b98b-483b-83ca-eb2255f66c7c","Type":"ContainerStarted","Data":"d93fdeb42b519a2028977ae017522bb0921bfa5bf2cd6376bcc017a29fb25e39"} Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.935307 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-pwxb4" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" Dec 01 09:11:14 crc kubenswrapper[4867]: I1201 09:11:14.935354 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" event={"ID":"c3ff1be1-b98b-483b-83ca-eb2255f66c7c","Type":"ContainerStarted","Data":"aca50f9c16690a372b0e0533e2324a76ac64ea985957c14c4c0400104f059ddc"} Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.936297 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zcq6l" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" Dec 01 09:11:14 crc kubenswrapper[4867]: E1201 09:11:14.936375 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-znqxw" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" Dec 01 09:11:14 crc 
kubenswrapper[4867]: I1201 09:11:14.947558 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.947529281 podStartE2EDuration="6.947529281s" podCreationTimestamp="2025-12-01 09:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:11:14.941245667 +0000 UTC m=+196.400632421" watchObservedRunningTime="2025-12-01 09:11:14.947529281 +0000 UTC m=+196.406916035" Dec 01 09:11:15 crc kubenswrapper[4867]: I1201 09:11:14.995300 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.995283161 podStartE2EDuration="10.995283161s" podCreationTimestamp="2025-12-01 09:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:11:14.995010093 +0000 UTC m=+196.454396847" watchObservedRunningTime="2025-12-01 09:11:14.995283161 +0000 UTC m=+196.454669915" Dec 01 09:11:15 crc kubenswrapper[4867]: I1201 09:11:15.939085 4867 generic.go:334] "Generic (PLEG): container finished" podID="38b87dcb-a910-4b50-b784-e5c8fa7c4669" containerID="069c443b3a1c6cf8ee31865835b3ae8225e57f54cc81884eadc1c56b5b8bac9f" exitCode=0 Dec 01 09:11:15 crc kubenswrapper[4867]: I1201 09:11:15.939260 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"38b87dcb-a910-4b50-b784-e5c8fa7c4669","Type":"ContainerDied","Data":"069c443b3a1c6cf8ee31865835b3ae8225e57f54cc81884eadc1c56b5b8bac9f"} Dec 01 09:11:15 crc kubenswrapper[4867]: I1201 09:11:15.941260 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n7wvd" 
event={"ID":"c3ff1be1-b98b-483b-83ca-eb2255f66c7c","Type":"ContainerStarted","Data":"62736a7673eb277b64e9bb19b94c620fc2eec0c0d2d0cb1be938b932145d0a92"} Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.780486 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.799398 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n7wvd" podStartSLOduration=180.799380422 podStartE2EDuration="3m0.799380422s" podCreationTimestamp="2025-12-01 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:11:15.974925201 +0000 UTC m=+197.434312015" watchObservedRunningTime="2025-12-01 09:11:18.799380422 +0000 UTC m=+200.258767176" Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.887574 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kubelet-dir\") pod \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\" (UID: \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\") " Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.887708 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kube-api-access\") pod \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\" (UID: \"38b87dcb-a910-4b50-b784-e5c8fa7c4669\") " Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.888965 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "38b87dcb-a910-4b50-b784-e5c8fa7c4669" (UID: "38b87dcb-a910-4b50-b784-e5c8fa7c4669"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.897924 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "38b87dcb-a910-4b50-b784-e5c8fa7c4669" (UID: "38b87dcb-a910-4b50-b784-e5c8fa7c4669"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.964289 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"38b87dcb-a910-4b50-b784-e5c8fa7c4669","Type":"ContainerDied","Data":"1f2090fbbb7c38763aaa784dc575f8f5c80bc20f773230a214f3b9fd7fb6b05a"} Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.964329 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f2090fbbb7c38763aaa784dc575f8f5c80bc20f773230a214f3b9fd7fb6b05a" Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.964418 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.989673 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:18 crc kubenswrapper[4867]: I1201 09:11:18.989720 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38b87dcb-a910-4b50-b784-e5c8fa7c4669-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:21 crc kubenswrapper[4867]: I1201 09:11:21.601670 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:11:21 crc kubenswrapper[4867]: I1201 09:11:21.602309 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:11:21 crc kubenswrapper[4867]: I1201 09:11:21.602365 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:11:21 crc kubenswrapper[4867]: I1201 09:11:21.603107 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Dec 01 09:11:21 crc kubenswrapper[4867]: I1201 09:11:21.603214 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be" gracePeriod=600 Dec 01 09:11:21 crc kubenswrapper[4867]: I1201 09:11:21.981147 4867 generic.go:334] "Generic (PLEG): container finished" podID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerID="a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c" exitCode=0 Dec 01 09:11:21 crc kubenswrapper[4867]: I1201 09:11:21.981220 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nv8tc" event={"ID":"545d34e2-c5a8-48ab-9603-9ae4986ab739","Type":"ContainerDied","Data":"a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c"} Dec 01 09:11:21 crc kubenswrapper[4867]: I1201 09:11:21.982850 4867 generic.go:334] "Generic (PLEG): container finished" podID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerID="3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e" exitCode=0 Dec 01 09:11:21 crc kubenswrapper[4867]: I1201 09:11:21.982870 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsrqd" event={"ID":"e2e7d46b-70cc-4b22-a041-3015e8b38f37","Type":"ContainerDied","Data":"3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e"} Dec 01 09:11:22 crc kubenswrapper[4867]: I1201 09:11:22.993019 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be" exitCode=0 Dec 01 09:11:22 crc kubenswrapper[4867]: I1201 09:11:22.993093 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" 
event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be"} Dec 01 09:11:24 crc kubenswrapper[4867]: I1201 09:11:23.999874 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"b87fb643a4876e9b1ddead390c05ce1df38b99a37c0239fe24fe120587d956a8"} Dec 01 09:11:24 crc kubenswrapper[4867]: I1201 09:11:24.001758 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nv8tc" event={"ID":"545d34e2-c5a8-48ab-9603-9ae4986ab739","Type":"ContainerStarted","Data":"29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a"} Dec 01 09:11:24 crc kubenswrapper[4867]: I1201 09:11:24.003924 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsrqd" event={"ID":"e2e7d46b-70cc-4b22-a041-3015e8b38f37","Type":"ContainerStarted","Data":"645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0"} Dec 01 09:11:24 crc kubenswrapper[4867]: I1201 09:11:24.007710 4867 generic.go:334] "Generic (PLEG): container finished" podID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerID="6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87" exitCode=0 Dec 01 09:11:24 crc kubenswrapper[4867]: I1201 09:11:24.007745 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wrtl" event={"ID":"4702840c-d6fa-4dcd-bd95-4ac89f95d727","Type":"ContainerDied","Data":"6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87"} Dec 01 09:11:24 crc kubenswrapper[4867]: I1201 09:11:24.010581 4867 generic.go:334] "Generic (PLEG): container finished" podID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerID="245619ba41ce163a92d6a85861de06b73705b796c29e9091badbce17a4f70b2e" exitCode=0 Dec 01 09:11:24 crc 
kubenswrapper[4867]: I1201 09:11:24.010611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr64t" event={"ID":"736579d0-4c1c-4766-abbd-4d681839bd0b","Type":"ContainerDied","Data":"245619ba41ce163a92d6a85861de06b73705b796c29e9091badbce17a4f70b2e"} Dec 01 09:11:24 crc kubenswrapper[4867]: I1201 09:11:24.067006 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nv8tc" podStartSLOduration=3.677904671 podStartE2EDuration="1m0.066986298s" podCreationTimestamp="2025-12-01 09:10:24 +0000 UTC" firstStartedPulling="2025-12-01 09:10:27.151064041 +0000 UTC m=+148.610450795" lastFinishedPulling="2025-12-01 09:11:23.540145668 +0000 UTC m=+204.999532422" observedRunningTime="2025-12-01 09:11:24.065925227 +0000 UTC m=+205.525311981" watchObservedRunningTime="2025-12-01 09:11:24.066986298 +0000 UTC m=+205.526373042" Dec 01 09:11:24 crc kubenswrapper[4867]: I1201 09:11:24.129693 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jsrqd" podStartSLOduration=3.585617569 podStartE2EDuration="59.129675783s" podCreationTimestamp="2025-12-01 09:10:25 +0000 UTC" firstStartedPulling="2025-12-01 09:10:28.203172215 +0000 UTC m=+149.662558969" lastFinishedPulling="2025-12-01 09:11:23.747230429 +0000 UTC m=+205.206617183" observedRunningTime="2025-12-01 09:11:24.126289586 +0000 UTC m=+205.585676350" watchObservedRunningTime="2025-12-01 09:11:24.129675783 +0000 UTC m=+205.589062527" Dec 01 09:11:25 crc kubenswrapper[4867]: I1201 09:11:25.017328 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wrtl" event={"ID":"4702840c-d6fa-4dcd-bd95-4ac89f95d727","Type":"ContainerStarted","Data":"305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4"} Dec 01 09:11:25 crc kubenswrapper[4867]: I1201 09:11:25.048869 4867 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/community-operators-5wrtl" podStartSLOduration=4.573520978 podStartE2EDuration="1m3.048849448s" podCreationTimestamp="2025-12-01 09:10:22 +0000 UTC" firstStartedPulling="2025-12-01 09:10:26.07044129 +0000 UTC m=+147.529828044" lastFinishedPulling="2025-12-01 09:11:24.54576976 +0000 UTC m=+206.005156514" observedRunningTime="2025-12-01 09:11:25.044900576 +0000 UTC m=+206.504287330" watchObservedRunningTime="2025-12-01 09:11:25.048849448 +0000 UTC m=+206.508236212" Dec 01 09:11:25 crc kubenswrapper[4867]: I1201 09:11:25.230880 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:11:25 crc kubenswrapper[4867]: I1201 09:11:25.231263 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:11:25 crc kubenswrapper[4867]: I1201 09:11:25.317770 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:11:25 crc kubenswrapper[4867]: I1201 09:11:25.649122 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:11:25 crc kubenswrapper[4867]: I1201 09:11:25.649166 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:11:25 crc kubenswrapper[4867]: I1201 09:11:25.705653 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:11:26 crc kubenswrapper[4867]: I1201 09:11:26.024499 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr64t" event={"ID":"736579d0-4c1c-4766-abbd-4d681839bd0b","Type":"ContainerStarted","Data":"3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5"} Dec 01 
09:11:26 crc kubenswrapper[4867]: I1201 09:11:26.042146 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zr64t" podStartSLOduration=4.148260431 podStartE2EDuration="1m3.042125317s" podCreationTimestamp="2025-12-01 09:10:23 +0000 UTC" firstStartedPulling="2025-12-01 09:10:25.985641703 +0000 UTC m=+147.445028457" lastFinishedPulling="2025-12-01 09:11:24.879506589 +0000 UTC m=+206.338893343" observedRunningTime="2025-12-01 09:11:26.039662126 +0000 UTC m=+207.499048880" watchObservedRunningTime="2025-12-01 09:11:26.042125317 +0000 UTC m=+207.501512071" Dec 01 09:11:32 crc kubenswrapper[4867]: I1201 09:11:32.057581 4867 generic.go:334] "Generic (PLEG): container finished" podID="5a77c808-2899-42d0-95e6-72f00df5432f" containerID="bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7" exitCode=0 Dec 01 09:11:32 crc kubenswrapper[4867]: I1201 09:11:32.057668 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfrqj" event={"ID":"5a77c808-2899-42d0-95e6-72f00df5432f","Type":"ContainerDied","Data":"bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7"} Dec 01 09:11:32 crc kubenswrapper[4867]: I1201 09:11:32.061322 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwxb4" event={"ID":"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f","Type":"ContainerStarted","Data":"013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe"} Dec 01 09:11:32 crc kubenswrapper[4867]: I1201 09:11:32.064074 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcq6l" event={"ID":"c347bb68-7140-4b8d-ae43-ee55b581c961","Type":"ContainerStarted","Data":"75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38"} Dec 01 09:11:32 crc kubenswrapper[4867]: I1201 09:11:32.066620 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-znqxw" event={"ID":"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88","Type":"ContainerStarted","Data":"e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50"} Dec 01 09:11:32 crc kubenswrapper[4867]: I1201 09:11:32.435138 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" podUID="49436ed5-4757-4aa2-92cb-63c65928893a" containerName="oauth-openshift" containerID="cri-o://a2a757aaa994e5d76a9f3552ebd4aad6006194498e32925bdf5255331b7ceca7" gracePeriod=15 Dec 01 09:11:33 crc kubenswrapper[4867]: I1201 09:11:33.510124 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:11:33 crc kubenswrapper[4867]: I1201 09:11:33.510247 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:11:33 crc kubenswrapper[4867]: I1201 09:11:33.559432 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:11:33 crc kubenswrapper[4867]: I1201 09:11:33.706508 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:11:33 crc kubenswrapper[4867]: I1201 09:11:33.706562 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:11:33 crc kubenswrapper[4867]: I1201 09:11:33.742801 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:11:34 crc kubenswrapper[4867]: I1201 09:11:34.078568 4867 generic.go:334] "Generic (PLEG): container finished" podID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerID="e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50" exitCode=0 Dec 01 09:11:34 
crc kubenswrapper[4867]: I1201 09:11:34.078865 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znqxw" event={"ID":"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88","Type":"ContainerDied","Data":"e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50"} Dec 01 09:11:34 crc kubenswrapper[4867]: I1201 09:11:34.081108 4867 generic.go:334] "Generic (PLEG): container finished" podID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerID="013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe" exitCode=0 Dec 01 09:11:34 crc kubenswrapper[4867]: I1201 09:11:34.081173 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwxb4" event={"ID":"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f","Type":"ContainerDied","Data":"013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe"} Dec 01 09:11:34 crc kubenswrapper[4867]: I1201 09:11:34.092321 4867 generic.go:334] "Generic (PLEG): container finished" podID="49436ed5-4757-4aa2-92cb-63c65928893a" containerID="a2a757aaa994e5d76a9f3552ebd4aad6006194498e32925bdf5255331b7ceca7" exitCode=0 Dec 01 09:11:34 crc kubenswrapper[4867]: I1201 09:11:34.092775 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" event={"ID":"49436ed5-4757-4aa2-92cb-63c65928893a","Type":"ContainerDied","Data":"a2a757aaa994e5d76a9f3552ebd4aad6006194498e32925bdf5255331b7ceca7"} Dec 01 09:11:34 crc kubenswrapper[4867]: I1201 09:11:34.095593 4867 generic.go:334] "Generic (PLEG): container finished" podID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerID="75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38" exitCode=0 Dec 01 09:11:34 crc kubenswrapper[4867]: I1201 09:11:34.095712 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcq6l" 
event={"ID":"c347bb68-7140-4b8d-ae43-ee55b581c961","Type":"ContainerDied","Data":"75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38"} Dec 01 09:11:34 crc kubenswrapper[4867]: I1201 09:11:34.153215 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:11:34 crc kubenswrapper[4867]: I1201 09:11:34.159897 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:11:35 crc kubenswrapper[4867]: I1201 09:11:35.289715 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:11:35 crc kubenswrapper[4867]: I1201 09:11:35.694161 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:11:35 crc kubenswrapper[4867]: I1201 09:11:35.825546 4867 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-p989q container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Dec 01 09:11:35 crc kubenswrapper[4867]: I1201 09:11:35.825603 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" podUID="49436ed5-4757-4aa2-92cb-63c65928893a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.213121 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zr64t"] Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.213338 4867 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-zr64t" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerName="registry-server" containerID="cri-o://3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5" gracePeriod=2 Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.652655 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.686553 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv"] Dec 01 09:11:36 crc kubenswrapper[4867]: E1201 09:11:36.686986 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b87dcb-a910-4b50-b784-e5c8fa7c4669" containerName="pruner" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.687055 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b87dcb-a910-4b50-b784-e5c8fa7c4669" containerName="pruner" Dec 01 09:11:36 crc kubenswrapper[4867]: E1201 09:11:36.687146 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49436ed5-4757-4aa2-92cb-63c65928893a" containerName="oauth-openshift" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.687205 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="49436ed5-4757-4aa2-92cb-63c65928893a" containerName="oauth-openshift" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.687347 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b87dcb-a910-4b50-b784-e5c8fa7c4669" containerName="pruner" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.687410 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="49436ed5-4757-4aa2-92cb-63c65928893a" containerName="oauth-openshift" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.687848 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.709314 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv"] Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.719772 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-session\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720037 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-trusted-ca-bundle\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720073 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-serving-cert\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720114 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-login\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720166 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-idp-0-file-data\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720201 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-cliconfig\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720236 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-error\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720267 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-audit-policies\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720326 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-service-ca\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720356 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-provider-selection\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720385 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49436ed5-4757-4aa2-92cb-63c65928893a-audit-dir\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720411 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-router-certs\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720460 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnv9l\" (UniqueName: \"kubernetes.io/projected/49436ed5-4757-4aa2-92cb-63c65928893a-kube-api-access-hnv9l\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720493 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-ocp-branding-template\") pod \"49436ed5-4757-4aa2-92cb-63c65928893a\" (UID: \"49436ed5-4757-4aa2-92cb-63c65928893a\") " Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720708 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720752 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720776 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gptk\" (UniqueName: \"kubernetes.io/projected/41e5a805-bce8-460f-9ece-a2fa98cbb62c-kube-api-access-8gptk\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720802 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720883 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-template-error\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: 
\"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720915 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41e5a805-bce8-460f-9ece-a2fa98cbb62c-audit-dir\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720943 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-template-login\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.720975 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-audit-policies\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.721018 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.721050 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.721087 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.721112 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.721146 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-session\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.721171 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.724536 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.726136 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.726653 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.727349 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.727409 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49436ed5-4757-4aa2-92cb-63c65928893a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.729456 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.730038 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.731349 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.731700 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.732059 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.732651 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.734115 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49436ed5-4757-4aa2-92cb-63c65928893a-kube-api-access-hnv9l" (OuterVolumeSpecName: "kube-api-access-hnv9l") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "kube-api-access-hnv9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.737418 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.742946 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49436ed5-4757-4aa2-92cb-63c65928893a" (UID: "49436ed5-4757-4aa2-92cb-63c65928893a"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.822853 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.822908 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.822925 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.822944 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gptk\" (UniqueName: \"kubernetes.io/projected/41e5a805-bce8-460f-9ece-a2fa98cbb62c-kube-api-access-8gptk\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.822982 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-template-error\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823047 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41e5a805-bce8-460f-9ece-a2fa98cbb62c-audit-dir\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823095 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-template-login\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823117 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-audit-policies\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823143 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 
crc kubenswrapper[4867]: I1201 09:11:36.823163 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823189 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823204 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823222 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-session\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823236 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823275 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnv9l\" (UniqueName: \"kubernetes.io/projected/49436ed5-4757-4aa2-92cb-63c65928893a-kube-api-access-hnv9l\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823287 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823297 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823307 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823317 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823326 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823336 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823344 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823353 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823361 4867 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823373 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823384 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823393 4867 
reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49436ed5-4757-4aa2-92cb-63c65928893a-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823401 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49436ed5-4757-4aa2-92cb-63c65928893a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.823879 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41e5a805-bce8-460f-9ece-a2fa98cbb62c-audit-dir\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.824185 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.824674 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.825168 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.827931 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:36 crc kubenswrapper[4867]: I1201 09:11:36.828731 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:37 crc kubenswrapper[4867]: I1201 09:11:37.111371 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" event={"ID":"49436ed5-4757-4aa2-92cb-63c65928893a","Type":"ContainerDied","Data":"1f844ff10a7ef19edfe609b1c457b6b67e4b87487df4208d5f3756048a4bd8be"} Dec 01 09:11:37 crc kubenswrapper[4867]: I1201 09:11:37.111450 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p989q" Dec 01 09:11:37 crc kubenswrapper[4867]: I1201 09:11:37.111454 4867 scope.go:117] "RemoveContainer" containerID="a2a757aaa994e5d76a9f3552ebd4aad6006194498e32925bdf5255331b7ceca7" Dec 01 09:11:37 crc kubenswrapper[4867]: I1201 09:11:37.134932 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p989q"] Dec 01 09:11:37 crc kubenswrapper[4867]: I1201 09:11:37.143656 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p989q"] Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.488651 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.488713 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-template-error\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.488721 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 
09:11:38.488797 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-system-session\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.489274 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-template-login\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.491303 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41e5a805-bce8-460f-9ece-a2fa98cbb62c-audit-policies\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.494624 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gptk\" (UniqueName: \"kubernetes.io/projected/41e5a805-bce8-460f-9ece-a2fa98cbb62c-kube-api-access-8gptk\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: \"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.499616 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41e5a805-bce8-460f-9ece-a2fa98cbb62c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cc8b7fbff-fm9tv\" (UID: 
\"41e5a805-bce8-460f-9ece-a2fa98cbb62c\") " pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.511715 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.616574 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsrqd"] Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.616931 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jsrqd" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerName="registry-server" containerID="cri-o://645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0" gracePeriod=2 Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.834754 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49436ed5-4757-4aa2-92cb-63c65928893a" path="/var/lib/kubelet/pods/49436ed5-4757-4aa2-92cb-63c65928893a/volumes" Dec 01 09:11:38 crc kubenswrapper[4867]: I1201 09:11:38.966612 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv"] Dec 01 09:11:39 crc kubenswrapper[4867]: I1201 09:11:39.124011 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" event={"ID":"41e5a805-bce8-460f-9ece-a2fa98cbb62c","Type":"ContainerStarted","Data":"8e4ad0045fe83087cc1e56a8a7d9b5a70ff1db18cfe0748f8239491f8c06199c"} Dec 01 09:11:42 crc kubenswrapper[4867]: I1201 09:11:42.282314 4867 generic.go:334] "Generic (PLEG): container finished" podID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerID="3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5" exitCode=0 Dec 01 09:11:42 crc kubenswrapper[4867]: I1201 09:11:42.282510 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zr64t" event={"ID":"736579d0-4c1c-4766-abbd-4d681839bd0b","Type":"ContainerDied","Data":"3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5"} Dec 01 09:11:43 crc kubenswrapper[4867]: E1201 09:11:43.707618 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5 is running failed: container process not found" containerID="3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 09:11:43 crc kubenswrapper[4867]: E1201 09:11:43.708357 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5 is running failed: container process not found" containerID="3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 09:11:43 crc kubenswrapper[4867]: E1201 09:11:43.708939 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5 is running failed: container process not found" containerID="3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 09:11:43 crc kubenswrapper[4867]: E1201 09:11:43.709001 4867 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-zr64t" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" 
containerName="registry-server" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.241520 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.333591 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2pfj\" (UniqueName: \"kubernetes.io/projected/736579d0-4c1c-4766-abbd-4d681839bd0b-kube-api-access-r2pfj\") pod \"736579d0-4c1c-4766-abbd-4d681839bd0b\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.333644 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-catalog-content\") pod \"736579d0-4c1c-4766-abbd-4d681839bd0b\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.334639 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-utilities" (OuterVolumeSpecName: "utilities") pod "736579d0-4c1c-4766-abbd-4d681839bd0b" (UID: "736579d0-4c1c-4766-abbd-4d681839bd0b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.333671 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-utilities\") pod \"736579d0-4c1c-4766-abbd-4d681839bd0b\" (UID: \"736579d0-4c1c-4766-abbd-4d681839bd0b\") " Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.341373 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.345059 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736579d0-4c1c-4766-abbd-4d681839bd0b-kube-api-access-r2pfj" (OuterVolumeSpecName: "kube-api-access-r2pfj") pod "736579d0-4c1c-4766-abbd-4d681839bd0b" (UID: "736579d0-4c1c-4766-abbd-4d681839bd0b"). InnerVolumeSpecName "kube-api-access-r2pfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.382725 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "736579d0-4c1c-4766-abbd-4d681839bd0b" (UID: "736579d0-4c1c-4766-abbd-4d681839bd0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.391617 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.423566 4867 generic.go:334] "Generic (PLEG): container finished" podID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerID="645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0" exitCode=0 Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.423622 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsrqd" event={"ID":"e2e7d46b-70cc-4b22-a041-3015e8b38f37","Type":"ContainerDied","Data":"645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0"} Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.423667 4867 scope.go:117] "RemoveContainer" containerID="645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.442203 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-catalog-content\") pod \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.442248 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-utilities\") pod \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.442372 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njnkr\" (UniqueName: \"kubernetes.io/projected/e2e7d46b-70cc-4b22-a041-3015e8b38f37-kube-api-access-njnkr\") pod \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\" (UID: \"e2e7d46b-70cc-4b22-a041-3015e8b38f37\") " Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.442572 4867 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-r2pfj\" (UniqueName: \"kubernetes.io/projected/736579d0-4c1c-4766-abbd-4d681839bd0b-kube-api-access-r2pfj\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.442588 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/736579d0-4c1c-4766-abbd-4d681839bd0b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.443059 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-utilities" (OuterVolumeSpecName: "utilities") pod "e2e7d46b-70cc-4b22-a041-3015e8b38f37" (UID: "e2e7d46b-70cc-4b22-a041-3015e8b38f37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.456494 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e7d46b-70cc-4b22-a041-3015e8b38f37-kube-api-access-njnkr" (OuterVolumeSpecName: "kube-api-access-njnkr") pod "e2e7d46b-70cc-4b22-a041-3015e8b38f37" (UID: "e2e7d46b-70cc-4b22-a041-3015e8b38f37"). InnerVolumeSpecName "kube-api-access-njnkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.460161 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2e7d46b-70cc-4b22-a041-3015e8b38f37" (UID: "e2e7d46b-70cc-4b22-a041-3015e8b38f37"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.544012 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.544056 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e7d46b-70cc-4b22-a041-3015e8b38f37-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:44 crc kubenswrapper[4867]: I1201 09:11:44.544066 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njnkr\" (UniqueName: \"kubernetes.io/projected/e2e7d46b-70cc-4b22-a041-3015e8b38f37-kube-api-access-njnkr\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.431328 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zr64t" event={"ID":"736579d0-4c1c-4766-abbd-4d681839bd0b","Type":"ContainerDied","Data":"400e469a019b1504dcc80fcd8f3c0c491501374dbb5d62a64812181fbbf3bfa5"} Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.431440 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zr64t" Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.435685 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsrqd" event={"ID":"e2e7d46b-70cc-4b22-a041-3015e8b38f37","Type":"ContainerDied","Data":"42eabb49921acd18b972ed5e8ef3d1e5787f0a5da04f825b4ff79f09e8650e01"} Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.435844 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsrqd" Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.440356 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" event={"ID":"41e5a805-bce8-460f-9ece-a2fa98cbb62c","Type":"ContainerStarted","Data":"8682f215a139bba6a0ec5fad4be647626ae64a5eb76edb5f58de1592f8560fc1"} Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.440770 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.445748 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.450476 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zr64t"] Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.454623 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zr64t"] Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.459036 4867 scope.go:117] "RemoveContainer" containerID="3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e" Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.501360 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6cc8b7fbff-fm9tv" podStartSLOduration=38.501342846 podStartE2EDuration="38.501342846s" podCreationTimestamp="2025-12-01 09:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:11:45.485921553 +0000 UTC m=+226.945308327" watchObservedRunningTime="2025-12-01 09:11:45.501342846 +0000 UTC m=+226.960729600" Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.502677 4867 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsrqd"] Dec 01 09:11:45 crc kubenswrapper[4867]: I1201 09:11:45.516905 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsrqd"] Dec 01 09:11:46 crc kubenswrapper[4867]: I1201 09:11:46.835496 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" path="/var/lib/kubelet/pods/736579d0-4c1c-4766-abbd-4d681839bd0b/volumes" Dec 01 09:11:46 crc kubenswrapper[4867]: I1201 09:11:46.836436 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" path="/var/lib/kubelet/pods/e2e7d46b-70cc-4b22-a041-3015e8b38f37/volumes" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.131062 4867 scope.go:117] "RemoveContainer" containerID="0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.156437 4867 scope.go:117] "RemoveContainer" containerID="3f1790476abe2f79baec1bc767a95e4ac4e07b15fdf9bacfe26eca3782d424a5" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.198484 4867 scope.go:117] "RemoveContainer" containerID="245619ba41ce163a92d6a85861de06b73705b796c29e9091badbce17a4f70b2e" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.261426 4867 scope.go:117] "RemoveContainer" containerID="b15eb9582b0724ef1efbcbd45215495b0e0b19950f2e71bb810a73b1bc10ed16" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.288802 4867 scope.go:117] "RemoveContainer" containerID="645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0" Dec 01 09:11:47 crc kubenswrapper[4867]: E1201 09:11:47.289339 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0\": container with ID starting with 
645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0 not found: ID does not exist" containerID="645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.289467 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0"} err="failed to get container status \"645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0\": rpc error: code = NotFound desc = could not find container \"645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0\": container with ID starting with 645996501d1d8ea873ec1a1bbac5d1c8de29eaffc788d950d2fa83d7310b9cd0 not found: ID does not exist" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.289577 4867 scope.go:117] "RemoveContainer" containerID="3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e" Dec 01 09:11:47 crc kubenswrapper[4867]: E1201 09:11:47.289983 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e\": container with ID starting with 3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e not found: ID does not exist" containerID="3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.290109 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e"} err="failed to get container status \"3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e\": rpc error: code = NotFound desc = could not find container \"3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e\": container with ID starting with 3594abe4cb11cd306de42f17900f0c5a2feb3db8ca3c3ea2da33054ff36caa7e not found: ID does not 
exist" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.290204 4867 scope.go:117] "RemoveContainer" containerID="0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1" Dec 01 09:11:47 crc kubenswrapper[4867]: E1201 09:11:47.290589 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1\": container with ID starting with 0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1 not found: ID does not exist" containerID="0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.290724 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1"} err="failed to get container status \"0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1\": rpc error: code = NotFound desc = could not find container \"0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1\": container with ID starting with 0ba8c9f979ddfdf17815e7484f6f7316c578a866ba658bdee309904bf7347ca1 not found: ID does not exist" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.455254 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwxb4" event={"ID":"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f","Type":"ContainerStarted","Data":"9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a"} Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.458266 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcq6l" event={"ID":"c347bb68-7140-4b8d-ae43-ee55b581c961","Type":"ContainerStarted","Data":"3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb"} Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.460380 4867 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-znqxw" event={"ID":"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88","Type":"ContainerStarted","Data":"66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f"} Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.462763 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfrqj" event={"ID":"5a77c808-2899-42d0-95e6-72f00df5432f","Type":"ContainerStarted","Data":"1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a"} Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.475069 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pwxb4" podStartSLOduration=4.435663968 podStartE2EDuration="1m22.475054943s" podCreationTimestamp="2025-12-01 09:10:25 +0000 UTC" firstStartedPulling="2025-12-01 09:10:28.32819817 +0000 UTC m=+149.787584924" lastFinishedPulling="2025-12-01 09:11:46.367589145 +0000 UTC m=+227.826975899" observedRunningTime="2025-12-01 09:11:47.471628885 +0000 UTC m=+228.931015639" watchObservedRunningTime="2025-12-01 09:11:47.475054943 +0000 UTC m=+228.934441707" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.489390 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zcq6l" podStartSLOduration=2.615745718 podStartE2EDuration="1m21.489373734s" podCreationTimestamp="2025-12-01 09:10:26 +0000 UTC" firstStartedPulling="2025-12-01 09:10:28.256460577 +0000 UTC m=+149.715847331" lastFinishedPulling="2025-12-01 09:11:47.130088593 +0000 UTC m=+228.589475347" observedRunningTime="2025-12-01 09:11:47.489126956 +0000 UTC m=+228.948513710" watchObservedRunningTime="2025-12-01 09:11:47.489373734 +0000 UTC m=+228.948760488" Dec 01 09:11:47 crc kubenswrapper[4867]: I1201 09:11:47.513283 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kfrqj" 
podStartSLOduration=3.3082068749999998 podStartE2EDuration="1m25.513269628s" podCreationTimestamp="2025-12-01 09:10:22 +0000 UTC" firstStartedPulling="2025-12-01 09:10:24.940449132 +0000 UTC m=+146.399835886" lastFinishedPulling="2025-12-01 09:11:47.145511875 +0000 UTC m=+228.604898639" observedRunningTime="2025-12-01 09:11:47.511449065 +0000 UTC m=+228.970835819" watchObservedRunningTime="2025-12-01 09:11:47.513269628 +0000 UTC m=+228.972656382" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.236281 4867 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.237964 4867 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.241389 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerName="registry-server" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.241420 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerName="registry-server" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.241433 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerName="extract-utilities" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.241443 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerName="extract-utilities" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.241455 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" 
containerName="extract-content" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.241480 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerName="extract-content" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.241490 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerName="extract-content" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.241498 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerName="extract-content" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.241516 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerName="registry-server" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.241523 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerName="registry-server" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.241533 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerName="extract-utilities" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.241540 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerName="extract-utilities" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.241656 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="736579d0-4c1c-4766-abbd-4d681839bd0b" containerName="registry-server" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.241672 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e7d46b-70cc-4b22-a041-3015e8b38f37" containerName="registry-server" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242069 4867 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 
01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242099 4867 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.242203 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242215 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.242223 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242230 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.242240 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242246 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.242253 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242259 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.242267 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242273 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.242282 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242288 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.242295 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242301 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242381 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242392 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242404 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242413 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242422 4867 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.242645 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.243801 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.244788 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d" gracePeriod=15 Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.245069 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71" gracePeriod=15 Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.245186 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b" gracePeriod=15 Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.245291 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a" gracePeriod=15 Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.245386 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa" gracePeriod=15 Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.262125 4867 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.283558 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-znqxw" podStartSLOduration=7.106646862 podStartE2EDuration="1m29.28353915s" podCreationTimestamp="2025-12-01 09:10:23 +0000 UTC" firstStartedPulling="2025-12-01 09:10:24.936540958 +0000 UTC m=+146.395927712" lastFinishedPulling="2025-12-01 09:11:47.113433206 +0000 UTC m=+228.572820000" observedRunningTime="2025-12-01 09:11:47.53886695 +0000 UTC m=+228.998253714" watchObservedRunningTime="2025-12-01 09:11:52.28353915 +0000 UTC m=+233.742925904" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.285708 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.450797 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 
01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.450879 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.450922 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.450955 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.451016 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.451137 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.451191 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.451219 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552220 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552286 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552309 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552325 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552360 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552379 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552378 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552410 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:52 crc 
kubenswrapper[4867]: I1201 09:11:52.552392 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552446 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552511 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552529 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552544 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552569 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552565 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.552608 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.570090 4867 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.570716 4867 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.571125 4867 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.571490 4867 controller.go:195] "Failed 
to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.571917 4867 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.571944 4867 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.572166 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="200ms" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.582196 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:11:52 crc kubenswrapper[4867]: W1201 09:11:52.610166 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-177ec4f190cd16e3c5729f0a724eeb46c685d084f77b2f58e25291aa522ad695 WatchSource:0}: Error finding container 177ec4f190cd16e3c5729f0a724eeb46c685d084f77b2f58e25291aa522ad695: Status 404 returned error can't find the container with id 177ec4f190cd16e3c5729f0a724eeb46c685d084f77b2f58e25291aa522ad695 Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.612589 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d0c71f6e100c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 09:11:52.611881153 +0000 UTC m=+234.071267907,LastTimestamp:2025-12-01 09:11:52.611881153 +0000 UTC m=+234.071267907,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 09:11:52 crc kubenswrapper[4867]: E1201 09:11:52.772765 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="400ms" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.972098 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:11:52 crc kubenswrapper[4867]: I1201 09:11:52.972164 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.021357 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.021938 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.022265 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: E1201 09:11:53.174589 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="800ms" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.469287 4867 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.469367 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.495778 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"177ec4f190cd16e3c5729f0a724eeb46c685d084f77b2f58e25291aa522ad695"} Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.522805 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.523369 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.523708 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.524028 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 
38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.578333 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.580076 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.580494 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.580891 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.581489 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.581878 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 
38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.582215 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: I1201 09:11:53.582606 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:53 crc kubenswrapper[4867]: E1201 09:11:53.975348 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="1.6s" Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.525796 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1"} Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.538284 4867 generic.go:334] "Generic (PLEG): container finished" podID="c55c2a62-535e-4781-a610-4eeea00a871c" containerID="d42414f89d26f144c34efca909ffc2678ccb4f962e3f2bfb75a834897c50fb21" exitCode=0 Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.538390 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"c55c2a62-535e-4781-a610-4eeea00a871c","Type":"ContainerDied","Data":"d42414f89d26f144c34efca909ffc2678ccb4f962e3f2bfb75a834897c50fb21"} Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.539522 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.539710 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.539867 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.540006 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.551279 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 
09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.552149 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.552768 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71" exitCode=0 Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.552796 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b" exitCode=0 Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.552804 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a" exitCode=0 Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.552826 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa" exitCode=2 Dec 01 09:11:54 crc kubenswrapper[4867]: I1201 09:11:54.553246 4867 scope.go:117] "RemoveContainer" containerID="c6426c1ec4207dcc5666d9cc39bbd0f1c1daa7776a66953a4f0c43f6fbed5169" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.214788 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.215920 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.217614 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.219498 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.219785 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.220094 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.220371 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.337135 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.337232 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.337288 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.337334 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.337334 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.337397 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.337586 4867 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.337603 4867 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.337612 4867 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.561572 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.562421 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d" exitCode=0 Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.562519 4867 scope.go:117] "RemoveContainer" containerID="73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.562556 4867 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.565995 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.566495 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.566677 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.566840 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.567044 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.576490 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="3.2s" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.579303 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.579649 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.579985 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.580202 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.580450 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.588962 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:11:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:11:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:11:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:11:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:20434c856c20158a4c73986bf7de93188afa338ed356d293a59f9e621072cfc3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:24f7dab5f4a6fcbb16d41b8a7345f9f9bae2ef1e2c53abed71c4f18eeafebc85\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1605131077},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\
\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d0d3fce260dd5b7e90c22ae8dc5f447b01e6bb9e798d0bef999ca7abcdc664c0\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e2569f586510e07a900470ff7716df01d9a339a305ce9148d93e2a2d7a4cafe8\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1202665305},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:630899c24169faa9a7fc626551082a1681089cb03fd59ddb5fe3a5f1c01502c2\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:6ef63256e3b067a9a4a64a3243538f4274785f4209e1b0afb57bc3d784c20fad\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201245003},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeByt
es\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3551
2335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"nam
es\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.589391 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.589599 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.589772 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.589962 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.589979 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.593204 4867 scope.go:117] "RemoveContainer" containerID="2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.619576 4867 scope.go:117] "RemoveContainer" containerID="f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.646041 4867 scope.go:117] "RemoveContainer" containerID="b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.666611 4867 scope.go:117] "RemoveContainer" containerID="4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.682588 4867 scope.go:117] "RemoveContainer" containerID="453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.730393 4867 scope.go:117] "RemoveContainer" containerID="73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.732416 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\": container with ID starting with 
73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71 not found: ID does not exist" containerID="73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.732448 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71"} err="failed to get container status \"73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\": rpc error: code = NotFound desc = could not find container \"73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71\": container with ID starting with 73c989386427c07f25c9bf8cdc80b0a6c8f090c280061b8870a6bff714a81b71 not found: ID does not exist" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.732494 4867 scope.go:117] "RemoveContainer" containerID="2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.733254 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\": container with ID starting with 2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b not found: ID does not exist" containerID="2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.733309 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b"} err="failed to get container status \"2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\": rpc error: code = NotFound desc = could not find container \"2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b\": container with ID starting with 2c6d4534de912b6b8a0a3b82626b1ce675dcf3fd4b62a86910d631ebca35b72b not found: ID does not 
exist" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.733334 4867 scope.go:117] "RemoveContainer" containerID="f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.734321 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\": container with ID starting with f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a not found: ID does not exist" containerID="f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.734349 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a"} err="failed to get container status \"f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\": rpc error: code = NotFound desc = could not find container \"f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a\": container with ID starting with f89d1faa6d5e990afd07ca512acf4e8256a59515d514d5c9a7d296a684d2eb8a not found: ID does not exist" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.734364 4867 scope.go:117] "RemoveContainer" containerID="b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.734704 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\": container with ID starting with b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa not found: ID does not exist" containerID="b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.734748 4867 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa"} err="failed to get container status \"b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\": rpc error: code = NotFound desc = could not find container \"b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa\": container with ID starting with b4b7244025b2618949e7642e1f2d8fce8153ee28e859a1cf2fb1c0bf9dfe6caa not found: ID does not exist" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.734775 4867 scope.go:117] "RemoveContainer" containerID="4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.735197 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\": container with ID starting with 4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d not found: ID does not exist" containerID="4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.735240 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d"} err="failed to get container status \"4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\": rpc error: code = NotFound desc = could not find container \"4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d\": container with ID starting with 4434cf41c318824f4dd527e2d2ba9b5282248b19faaccc67f57f01c04cde9b8d not found: ID does not exist" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.735269 4867 scope.go:117] "RemoveContainer" containerID="453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba" Dec 01 09:11:55 crc kubenswrapper[4867]: E1201 09:11:55.735579 4867 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\": container with ID starting with 453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba not found: ID does not exist" containerID="453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.735600 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba"} err="failed to get container status \"453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\": rpc error: code = NotFound desc = could not find container \"453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba\": container with ID starting with 453a03ad6301167aa897573a7a6719d28edecd825189e45609ac5f7b6d1b04ba not found: ID does not exist" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.846796 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.847361 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.847692 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.847939 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.848261 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.848415 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": 
dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.945867 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-kubelet-dir\") pod \"c55c2a62-535e-4781-a610-4eeea00a871c\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.945918 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c55c2a62-535e-4781-a610-4eeea00a871c-kube-api-access\") pod \"c55c2a62-535e-4781-a610-4eeea00a871c\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.945993 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-var-lock\") pod \"c55c2a62-535e-4781-a610-4eeea00a871c\" (UID: \"c55c2a62-535e-4781-a610-4eeea00a871c\") " Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.946176 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c55c2a62-535e-4781-a610-4eeea00a871c" (UID: "c55c2a62-535e-4781-a610-4eeea00a871c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.946188 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-var-lock" (OuterVolumeSpecName: "var-lock") pod "c55c2a62-535e-4781-a610-4eeea00a871c" (UID: "c55c2a62-535e-4781-a610-4eeea00a871c"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:11:55 crc kubenswrapper[4867]: I1201 09:11:55.953536 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55c2a62-535e-4781-a610-4eeea00a871c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c55c2a62-535e-4781-a610-4eeea00a871c" (UID: "c55c2a62-535e-4781-a610-4eeea00a871c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.046987 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.047017 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c55c2a62-535e-4781-a610-4eeea00a871c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.047030 4867 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c55c2a62-535e-4781-a610-4eeea00a871c-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.306850 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.306906 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.356362 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.356930 4867 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.357402 4867 status_manager.go:851] "Failed to get status for pod" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" pod="openshift-marketplace/redhat-operators-pwxb4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pwxb4\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.358264 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.358867 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.359435 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.359704 4867 status_manager.go:851] "Failed to get status for pod" 
podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.571750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c55c2a62-535e-4781-a610-4eeea00a871c","Type":"ContainerDied","Data":"a076593fd179714b02857c8dc750f5b308aff8f49017d913c5ed7c1ba73141fc"} Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.571844 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a076593fd179714b02857c8dc750f5b308aff8f49017d913c5ed7c1ba73141fc" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.571781 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.593759 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.593943 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.594078 4867 status_manager.go:851] "Failed to get status for pod" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" pod="openshift-marketplace/redhat-operators-pwxb4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pwxb4\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.594207 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.594346 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.594485 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.614107 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.614699 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 
09:11:56.615085 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.615502 4867 status_manager.go:851] "Failed to get status for pod" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" pod="openshift-marketplace/redhat-operators-pwxb4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pwxb4\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.615851 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.616137 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.616456 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 
09:11:56.670188 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.670234 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.709305 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.710019 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.710610 4867 status_manager.go:851] "Failed to get status for pod" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" pod="openshift-marketplace/redhat-operators-zcq6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zcq6l\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.711160 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.711593 4867 status_manager.go:851] "Failed to get status for pod" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" pod="openshift-marketplace/redhat-operators-pwxb4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pwxb4\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.711857 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.712065 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.712389 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:56 crc kubenswrapper[4867]: I1201 09:11:56.833863 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 09:11:57 crc kubenswrapper[4867]: I1201 09:11:57.619048 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:11:57 crc kubenswrapper[4867]: I1201 09:11:57.620630 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" 
pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:57 crc kubenswrapper[4867]: I1201 09:11:57.621655 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:57 crc kubenswrapper[4867]: I1201 09:11:57.623908 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:57 crc kubenswrapper[4867]: I1201 09:11:57.624607 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:57 crc kubenswrapper[4867]: I1201 09:11:57.624805 4867 status_manager.go:851] "Failed to get status for pod" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" pod="openshift-marketplace/redhat-operators-zcq6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zcq6l\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:57 crc kubenswrapper[4867]: I1201 09:11:57.624976 4867 status_manager.go:851] "Failed to get status for pod" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" 
pod="openshift-marketplace/redhat-operators-pwxb4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pwxb4\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:58 crc kubenswrapper[4867]: E1201 09:11:58.777068 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="6.4s" Dec 01 09:11:58 crc kubenswrapper[4867]: I1201 09:11:58.829835 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:58 crc kubenswrapper[4867]: I1201 09:11:58.830335 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:58 crc kubenswrapper[4867]: I1201 09:11:58.831387 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:58 crc kubenswrapper[4867]: I1201 09:11:58.831570 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:58 crc kubenswrapper[4867]: I1201 09:11:58.831718 4867 status_manager.go:851] "Failed to get status for pod" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" pod="openshift-marketplace/redhat-operators-zcq6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zcq6l\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:58 crc kubenswrapper[4867]: I1201 09:11:58.831911 4867 status_manager.go:851] "Failed to get status for pod" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" pod="openshift-marketplace/redhat-operators-pwxb4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pwxb4\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:11:59 crc kubenswrapper[4867]: E1201 09:11:59.294513 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d0c71f6e100c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 09:11:52.611881153 +0000 UTC m=+234.071267907,LastTimestamp:2025-12-01 09:11:52.611881153 +0000 UTC 
m=+234.071267907,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 09:12:05 crc kubenswrapper[4867]: E1201 09:12:05.178456 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="7s" Dec 01 09:12:05 crc kubenswrapper[4867]: E1201 09:12:05.791117 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:12:05Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:12:05Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:12:05Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:12:05Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:20434c856c20158a4c73986bf7de93188afa338ed356d293a59f9e621072cfc3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:24f7dab5f4a6fcbb16d41b8a7345f9f9bae2ef1e2c53abed71c4f18eeafebc85\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1605131077},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d0d3fce260dd5b7e90c22ae8dc5f447b01e6bb9e798d0bef999ca7abcdc664c0\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e2569f586510e07a900470ff7716df01d9a339a305ce9148d93e2a2d7a4cafe8\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1202665305},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:630899c24169faa9a7fc626551082a1681089cb03fd59ddb5fe3a5f1c01502c2\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:6ef63256e3b067a9a4a64a3243538f4274785f4209e1b0afb57bc3d784c20fad\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201245003},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27670
3f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-con
troller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5
006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: E1201 09:12:05.792426 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: E1201 09:12:05.793043 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: E1201 09:12:05.793296 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection 
refused" Dec 01 09:12:05 crc kubenswrapper[4867]: E1201 09:12:05.793598 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: E1201 09:12:05.793709 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.826070 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.827111 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.827365 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.827564 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.828501 4867 status_manager.go:851] "Failed to get status for 
pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.828711 4867 status_manager.go:851] "Failed to get status for pod" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" pod="openshift-marketplace/redhat-operators-zcq6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zcq6l\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.829021 4867 status_manager.go:851] "Failed to get status for pod" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" pod="openshift-marketplace/redhat-operators-pwxb4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pwxb4\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.839112 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.839151 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:05 crc kubenswrapper[4867]: E1201 09:12:05.839623 4867 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:05 crc kubenswrapper[4867]: I1201 09:12:05.840106 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:05 crc kubenswrapper[4867]: W1201 09:12:05.860125 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-3237893d5cc81c48a41383b597d7225a7d7c71c5f8883c3122e6aaa57ac27a39 WatchSource:0}: Error finding container 3237893d5cc81c48a41383b597d7225a7d7c71c5f8883c3122e6aaa57ac27a39: Status 404 returned error can't find the container with id 3237893d5cc81c48a41383b597d7225a7d7c71c5f8883c3122e6aaa57ac27a39 Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.625134 4867 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="64e8e96b376e329675dd53d3b6ba3267cc5856c42ed222186afe4f0b587b64a9" exitCode=0 Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.625212 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"64e8e96b376e329675dd53d3b6ba3267cc5856c42ed222186afe4f0b587b64a9"} Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.625397 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3237893d5cc81c48a41383b597d7225a7d7c71c5f8883c3122e6aaa57ac27a39"} Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.625613 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.625628 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:06 crc kubenswrapper[4867]: E1201 09:12:06.625999 4867 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.626009 4867 status_manager.go:851] "Failed to get status for pod" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.626261 4867 status_manager.go:851] "Failed to get status for pod" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" pod="openshift-marketplace/redhat-operators-zcq6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zcq6l\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.626471 4867 status_manager.go:851] "Failed to get status for pod" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" pod="openshift-marketplace/redhat-operators-pwxb4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pwxb4\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.626722 4867 status_manager.go:851] "Failed to get status for pod" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" pod="openshift-marketplace/certified-operators-kfrqj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfrqj\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.626971 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:06 crc kubenswrapper[4867]: I1201 09:12:06.627202 4867 status_manager.go:851] "Failed to get status for pod" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" pod="openshift-marketplace/certified-operators-znqxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-znqxw\": dial tcp 38.102.83.224:6443: connect: connection refused" Dec 01 09:12:07 crc kubenswrapper[4867]: I1201 09:12:07.639257 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 09:12:07 crc kubenswrapper[4867]: I1201 09:12:07.639597 4867 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08" exitCode=1 Dec 01 09:12:07 crc kubenswrapper[4867]: I1201 09:12:07.639653 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08"} Dec 01 09:12:07 crc kubenswrapper[4867]: I1201 09:12:07.640211 4867 scope.go:117] "RemoveContainer" containerID="86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08" Dec 01 09:12:07 crc kubenswrapper[4867]: I1201 09:12:07.685800 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c5f97e9e7594df315ad9a575aeeafcea5c2b7d38a79b742ccecea4ce62128c64"} Dec 01 09:12:07 
crc kubenswrapper[4867]: I1201 09:12:07.685872 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2f2b2534f6793d7d441c8d053d7e21aa997ba30527d98398a43bc7c24299ffd2"} Dec 01 09:12:07 crc kubenswrapper[4867]: I1201 09:12:07.685887 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d0916a35fab5897d7fd981f0e3dd7365d51fcffdbc7ba00f3864fbf3edf08815"} Dec 01 09:12:07 crc kubenswrapper[4867]: I1201 09:12:07.685899 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"feec0f773538dafd4bf15991ac7451ac8398c8e245a7aee15f293f7a201078b8"} Dec 01 09:12:08 crc kubenswrapper[4867]: I1201 09:12:08.692824 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6d5617b2abb55f0e83273daf149f34746a8b3aaf9b2d7b0e4d1a838b91d012b7"} Dec 01 09:12:08 crc kubenswrapper[4867]: I1201 09:12:08.693546 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:08 crc kubenswrapper[4867]: I1201 09:12:08.693613 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:08 crc kubenswrapper[4867]: I1201 09:12:08.693584 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:08 crc kubenswrapper[4867]: I1201 09:12:08.696350 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 09:12:08 crc kubenswrapper[4867]: I1201 09:12:08.696467 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3bef36a5bcde2a09452d7fa49400a4f9ee7e5a03ebf88bca6dc4a8d58afde9ec"} Dec 01 09:12:08 crc kubenswrapper[4867]: I1201 09:12:08.861134 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:12:10 crc kubenswrapper[4867]: I1201 09:12:10.840994 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:10 crc kubenswrapper[4867]: I1201 09:12:10.841050 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:10 crc kubenswrapper[4867]: I1201 09:12:10.849350 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:13 crc kubenswrapper[4867]: I1201 09:12:13.703460 4867 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:13 crc kubenswrapper[4867]: I1201 09:12:13.729074 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:13 crc kubenswrapper[4867]: I1201 09:12:13.729110 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:13 crc kubenswrapper[4867]: I1201 09:12:13.735344 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:13 crc kubenswrapper[4867]: I1201 09:12:13.738184 4867 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="761aa1a4-ce36-4c7c-a735-c13224448a69" Dec 01 09:12:14 crc kubenswrapper[4867]: I1201 09:12:14.734463 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:14 crc kubenswrapper[4867]: I1201 09:12:14.734492 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:16 crc kubenswrapper[4867]: I1201 09:12:16.970773 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:12:16 crc kubenswrapper[4867]: I1201 09:12:16.971039 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 09:12:16 crc kubenswrapper[4867]: I1201 09:12:16.971383 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 09:12:18 crc kubenswrapper[4867]: I1201 09:12:18.858249 4867 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
podUID="761aa1a4-ce36-4c7c-a735-c13224448a69" Dec 01 09:12:23 crc kubenswrapper[4867]: I1201 09:12:23.023496 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 09:12:23 crc kubenswrapper[4867]: I1201 09:12:23.812717 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 09:12:24 crc kubenswrapper[4867]: I1201 09:12:24.177800 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 09:12:24 crc kubenswrapper[4867]: I1201 09:12:24.445712 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 09:12:24 crc kubenswrapper[4867]: I1201 09:12:24.467679 4867 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 09:12:24 crc kubenswrapper[4867]: I1201 09:12:24.532918 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 09:12:24 crc kubenswrapper[4867]: I1201 09:12:24.726646 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 09:12:25 crc kubenswrapper[4867]: I1201 09:12:25.003059 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 09:12:25 crc kubenswrapper[4867]: I1201 09:12:25.398823 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 09:12:25 crc kubenswrapper[4867]: I1201 09:12:25.401003 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 09:12:25 crc kubenswrapper[4867]: I1201 
09:12:25.575503 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 09:12:25 crc kubenswrapper[4867]: I1201 09:12:25.672509 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 09:12:25 crc kubenswrapper[4867]: I1201 09:12:25.701506 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 09:12:25 crc kubenswrapper[4867]: I1201 09:12:25.928346 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 09:12:26 crc kubenswrapper[4867]: I1201 09:12:26.144359 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 09:12:26 crc kubenswrapper[4867]: I1201 09:12:26.220831 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 09:12:26 crc kubenswrapper[4867]: I1201 09:12:26.420615 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 09:12:26 crc kubenswrapper[4867]: I1201 09:12:26.551277 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 09:12:26 crc kubenswrapper[4867]: I1201 09:12:26.734807 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 09:12:26 crc kubenswrapper[4867]: I1201 09:12:26.792734 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 09:12:26 crc kubenswrapper[4867]: I1201 09:12:26.801554 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Dec 01 09:12:26 crc kubenswrapper[4867]: I1201 09:12:26.971712 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 09:12:26 crc kubenswrapper[4867]: I1201 09:12:26.971799 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.020541 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.037108 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.191322 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.206045 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.355437 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.569561 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.580358 4867 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.743084 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.763972 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.783635 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.823115 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.855860 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.859586 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.873918 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.932536 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.940041 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 09:12:27 crc kubenswrapper[4867]: I1201 09:12:27.949069 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.022908 4867 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.053230 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.060264 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.177676 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.180507 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.316050 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.328592 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.334806 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.335623 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.403073 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.512952 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-operator"/"metrics-tls" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.564322 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.571531 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.573609 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.635138 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.643492 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.668995 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.707948 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.869448 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.896362 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 09:12:28 crc kubenswrapper[4867]: I1201 09:12:28.966961 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 
09:12:29.082415 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.095641 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.104538 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.189195 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.219882 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.271004 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.385379 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.399036 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.446666 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.514511 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.525509 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.544900 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.601890 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.604740 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.618482 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.618506 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.645885 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.671011 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.708688 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.751283 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.774536 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.829727 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 09:12:29 crc kubenswrapper[4867]: I1201 09:12:29.888463 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.101894 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.108990 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.120413 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.184496 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.193888 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.252949 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.336319 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.404136 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 
09:12:30.459073 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.564216 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.594572 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.597886 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.724577 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.739519 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.743482 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.765596 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.779373 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.809675 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.816262 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 09:12:30 crc 
kubenswrapper[4867]: I1201 09:12:30.827022 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.847415 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.877092 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 09:12:30 crc kubenswrapper[4867]: I1201 09:12:30.935920 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.064792 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.125913 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.207863 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.237473 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.267924 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.332016 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.407514 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.470670 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.481927 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.522333 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.528639 4867 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.531191 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.531176876 podStartE2EDuration="39.531176876s" podCreationTimestamp="2025-12-01 09:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:12:13.492633584 +0000 UTC m=+254.952020328" watchObservedRunningTime="2025-12-01 09:12:31.531176876 +0000 UTC m=+272.990563630" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.532889 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.532934 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.533374 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 
09:12:31.533413 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="952d3740-c446-483d-805f-8c6a97cfbbd4" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.537152 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.554920 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.554895884 podStartE2EDuration="18.554895884s" podCreationTimestamp="2025-12-01 09:12:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:12:31.553008483 +0000 UTC m=+273.012395247" watchObservedRunningTime="2025-12-01 09:12:31.554895884 +0000 UTC m=+273.014282678" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.626722 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.631143 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.733423 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.771732 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.801857 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.830876 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.903128 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 09:12:31 crc kubenswrapper[4867]: I1201 09:12:31.983962 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.005349 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.025293 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.073740 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.077449 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.156691 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.251892 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.283678 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.355310 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 09:12:32 crc 
kubenswrapper[4867]: I1201 09:12:32.491077 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.512918 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.647247 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.652126 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.661979 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.693060 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.755656 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.767381 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.798148 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.887944 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 09:12:32 crc kubenswrapper[4867]: I1201 09:12:32.974067 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 
01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.035504 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.082050 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.115560 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.147737 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.322925 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.372909 4867 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.373057 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.384444 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.431709 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.452737 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.453774 4867 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.492489 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.497920 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.551230 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.576977 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.625762 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.669727 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.676657 4867 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.697963 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.825611 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.833363 4867 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.865537 4867 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 09:12:33 crc kubenswrapper[4867]: I1201 09:12:33.943383 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.062160 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.072344 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.088346 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.145860 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.204187 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.232085 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.232592 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.297066 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.308619 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 09:12:34 crc 
kubenswrapper[4867]: I1201 09:12:34.368516 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.388643 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.408026 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.435695 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.566964 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.713187 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.852403 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.857527 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.858061 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.863849 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 09:12:34 crc kubenswrapper[4867]: I1201 09:12:34.970193 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.035186 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.055219 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.149771 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.210918 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.422116 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.442550 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.449758 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.465975 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.525048 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.542835 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.603918 4867 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.605156 4867 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.681306 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.761478 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.800354 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.824971 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.876109 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.889105 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.912718 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 09:12:35 crc kubenswrapper[4867]: I1201 09:12:35.948148 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.091851 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 
09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.125745 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.173543 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.204037 4867 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.204291 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1" gracePeriod=5 Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.303490 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.352292 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.378086 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.449347 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.450374 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.595927 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.628340 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.674121 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.698588 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.752455 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.971519 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.972424 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.972525 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.975128 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" 
containerStatusID={"Type":"cri-o","ID":"3bef36a5bcde2a09452d7fa49400a4f9ee7e5a03ebf88bca6dc4a8d58afde9ec"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 01 09:12:36 crc kubenswrapper[4867]: I1201 09:12:36.975778 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://3bef36a5bcde2a09452d7fa49400a4f9ee7e5a03ebf88bca6dc4a8d58afde9ec" gracePeriod=30 Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.007503 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.061102 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.177475 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.212684 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.221840 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.349071 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.644492 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.690258 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.690314 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.795328 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 09:12:37 crc kubenswrapper[4867]: I1201 09:12:37.963438 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.207108 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.307161 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.428619 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.476347 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.525158 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.684608 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 09:12:38 crc 
kubenswrapper[4867]: I1201 09:12:38.725092 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.726416 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.731627 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.760581 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.843622 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.881355 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 09:12:38 crc kubenswrapper[4867]: I1201 09:12:38.891726 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 09:12:39 crc kubenswrapper[4867]: I1201 09:12:39.151484 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 09:12:39 crc kubenswrapper[4867]: I1201 09:12:39.185620 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 09:12:39 crc kubenswrapper[4867]: I1201 09:12:39.257457 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 09:12:39 crc kubenswrapper[4867]: I1201 09:12:39.264658 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 09:12:39 
crc kubenswrapper[4867]: I1201 09:12:39.549079 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 09:12:39 crc kubenswrapper[4867]: I1201 09:12:39.637462 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 09:12:40 crc kubenswrapper[4867]: I1201 09:12:40.245940 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 09:12:40 crc kubenswrapper[4867]: I1201 09:12:40.688225 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.200799 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.773180 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.773598 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.870349 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.885188 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.910183 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.910258 4867 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1" exitCode=137 Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.910300 4867 scope.go:117] "RemoveContainer" containerID="8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.910403 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.929204 4867 scope.go:117] "RemoveContainer" containerID="8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1" Dec 01 09:12:41 crc kubenswrapper[4867]: E1201 09:12:41.929887 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1\": container with ID starting with 8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1 not found: ID does not exist" containerID="8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.929934 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1"} err="failed to get container status \"8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1\": rpc error: code = NotFound desc = could not find container \"8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1\": container with ID starting with 8175543ba001df0f0808ba8c329af9dff180dd3b011a61d43e5c40948a6a5dc1 not found: ID does not exist" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.971762 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.971873 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 
09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.971912 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.971973 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.971989 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.972047 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.972064 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.972150 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.972359 4867 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.972382 4867 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.972403 4867 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.972419 4867 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:41 crc kubenswrapper[4867]: I1201 09:12:41.972437 4867 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:42 crc kubenswrapper[4867]: I1201 09:12:42.833094 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 
01 09:12:42 crc kubenswrapper[4867]: I1201 09:12:42.833390 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 01 09:12:42 crc kubenswrapper[4867]: I1201 09:12:42.845740 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 09:12:42 crc kubenswrapper[4867]: I1201 09:12:42.845772 4867 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="dbb46427-47cc-44bd-9841-a7a187eb76ab" Dec 01 09:12:42 crc kubenswrapper[4867]: I1201 09:12:42.852343 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 09:12:42 crc kubenswrapper[4867]: I1201 09:12:42.852391 4867 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="dbb46427-47cc-44bd-9841-a7a187eb76ab" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.204869 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfrqj"] Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.207203 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kfrqj" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" containerName="registry-server" containerID="cri-o://1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a" gracePeriod=30 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.211266 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-znqxw"] Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.211511 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-znqxw" 
podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerName="registry-server" containerID="cri-o://66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f" gracePeriod=30 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.221747 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5wrtl"] Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.222091 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5wrtl" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerName="registry-server" containerID="cri-o://305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4" gracePeriod=30 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.236295 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6w6sm"] Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.236503 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" podUID="c7d3f2ef-022b-41f5-84e5-6be42f48b023" containerName="marketplace-operator" containerID="cri-o://b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18" gracePeriod=30 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.241858 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nv8tc"] Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.242254 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nv8tc" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerName="registry-server" containerID="cri-o://29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a" gracePeriod=30 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.246481 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-pwxb4"] Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.246837 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pwxb4" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerName="registry-server" containerID="cri-o://9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a" gracePeriod=30 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.251775 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zcq6l"] Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.253858 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zcq6l" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerName="registry-server" containerID="cri-o://3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb" gracePeriod=30 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.622546 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.674529 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.739350 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.749687 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.750674 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.755119 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.762644 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.806646 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-catalog-content\") pod \"5a77c808-2899-42d0-95e6-72f00df5432f\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.806729 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-utilities\") pod \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.806755 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h4gh\" (UniqueName: \"kubernetes.io/projected/5a77c808-2899-42d0-95e6-72f00df5432f-kube-api-access-8h4gh\") pod \"5a77c808-2899-42d0-95e6-72f00df5432f\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.806776 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-utilities\") pod \"5a77c808-2899-42d0-95e6-72f00df5432f\" (UID: \"5a77c808-2899-42d0-95e6-72f00df5432f\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.806841 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-catalog-content\") pod \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.806880 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkgmj\" (UniqueName: \"kubernetes.io/projected/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-kube-api-access-vkgmj\") pod \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\" (UID: \"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.813093 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-kube-api-access-vkgmj" (OuterVolumeSpecName: "kube-api-access-vkgmj") pod "d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" (UID: "d6f7e7b3-ad0a-41ab-8291-c5200fe31a88"). InnerVolumeSpecName "kube-api-access-vkgmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.813758 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-utilities" (OuterVolumeSpecName: "utilities") pod "5a77c808-2899-42d0-95e6-72f00df5432f" (UID: "5a77c808-2899-42d0-95e6-72f00df5432f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.815407 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-utilities" (OuterVolumeSpecName: "utilities") pod "d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" (UID: "d6f7e7b3-ad0a-41ab-8291-c5200fe31a88"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.816112 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a77c808-2899-42d0-95e6-72f00df5432f-kube-api-access-8h4gh" (OuterVolumeSpecName: "kube-api-access-8h4gh") pod "5a77c808-2899-42d0-95e6-72f00df5432f" (UID: "5a77c808-2899-42d0-95e6-72f00df5432f"). InnerVolumeSpecName "kube-api-access-8h4gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.860544 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a77c808-2899-42d0-95e6-72f00df5432f" (UID: "5a77c808-2899-42d0-95e6-72f00df5432f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.861538 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" (UID: "d6f7e7b3-ad0a-41ab-8291-c5200fe31a88"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.908713 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-utilities\") pod \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.908786 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-catalog-content\") pod \"545d34e2-c5a8-48ab-9603-9ae4986ab739\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.908871 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65b95\" (UniqueName: \"kubernetes.io/projected/4702840c-d6fa-4dcd-bd95-4ac89f95d727-kube-api-access-65b95\") pod \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.908922 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z7p5\" (UniqueName: \"kubernetes.io/projected/c7d3f2ef-022b-41f5-84e5-6be42f48b023-kube-api-access-7z7p5\") pod \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.908967 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfsbs\" (UniqueName: \"kubernetes.io/projected/545d34e2-c5a8-48ab-9603-9ae4986ab739-kube-api-access-rfsbs\") pod \"545d34e2-c5a8-48ab-9603-9ae4986ab739\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.908990 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-fsjmq\" (UniqueName: \"kubernetes.io/projected/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-kube-api-access-fsjmq\") pod \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909020 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-operator-metrics\") pod \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909037 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-utilities\") pod \"545d34e2-c5a8-48ab-9603-9ae4986ab739\" (UID: \"545d34e2-c5a8-48ab-9603-9ae4986ab739\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909074 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-trusted-ca\") pod \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\" (UID: \"c7d3f2ef-022b-41f5-84e5-6be42f48b023\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909123 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghmkb\" (UniqueName: \"kubernetes.io/projected/c347bb68-7140-4b8d-ae43-ee55b581c961-kube-api-access-ghmkb\") pod \"c347bb68-7140-4b8d-ae43-ee55b581c961\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909166 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-utilities\") pod \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\" (UID: 
\"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909193 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-catalog-content\") pod \"c347bb68-7140-4b8d-ae43-ee55b581c961\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909228 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-utilities\") pod \"c347bb68-7140-4b8d-ae43-ee55b581c961\" (UID: \"c347bb68-7140-4b8d-ae43-ee55b581c961\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909256 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-catalog-content\") pod \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\" (UID: \"4702840c-d6fa-4dcd-bd95-4ac89f95d727\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909277 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-catalog-content\") pod \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\" (UID: \"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f\") " Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909674 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909688 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkgmj\" (UniqueName: \"kubernetes.io/projected/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-kube-api-access-vkgmj\") on node \"crc\" 
DevicePath \"\"" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909700 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909710 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909721 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h4gh\" (UniqueName: \"kubernetes.io/projected/5a77c808-2899-42d0-95e6-72f00df5432f-kube-api-access-8h4gh\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.909731 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a77c808-2899-42d0-95e6-72f00df5432f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.911447 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-utilities" (OuterVolumeSpecName: "utilities") pod "8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" (UID: "8d9f8ccf-fbe9-42b0-84e5-b1913365a11f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.911854 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-utilities" (OuterVolumeSpecName: "utilities") pod "4702840c-d6fa-4dcd-bd95-4ac89f95d727" (UID: "4702840c-d6fa-4dcd-bd95-4ac89f95d727"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.912168 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-utilities" (OuterVolumeSpecName: "utilities") pod "c347bb68-7140-4b8d-ae43-ee55b581c961" (UID: "c347bb68-7140-4b8d-ae43-ee55b581c961"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.913109 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c7d3f2ef-022b-41f5-84e5-6be42f48b023" (UID: "c7d3f2ef-022b-41f5-84e5-6be42f48b023"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.913787 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-utilities" (OuterVolumeSpecName: "utilities") pod "545d34e2-c5a8-48ab-9603-9ae4986ab739" (UID: "545d34e2-c5a8-48ab-9603-9ae4986ab739"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.914231 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d3f2ef-022b-41f5-84e5-6be42f48b023-kube-api-access-7z7p5" (OuterVolumeSpecName: "kube-api-access-7z7p5") pod "c7d3f2ef-022b-41f5-84e5-6be42f48b023" (UID: "c7d3f2ef-022b-41f5-84e5-6be42f48b023"). InnerVolumeSpecName "kube-api-access-7z7p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.914975 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/545d34e2-c5a8-48ab-9603-9ae4986ab739-kube-api-access-rfsbs" (OuterVolumeSpecName: "kube-api-access-rfsbs") pod "545d34e2-c5a8-48ab-9603-9ae4986ab739" (UID: "545d34e2-c5a8-48ab-9603-9ae4986ab739"). InnerVolumeSpecName "kube-api-access-rfsbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.915207 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-kube-api-access-fsjmq" (OuterVolumeSpecName: "kube-api-access-fsjmq") pod "8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" (UID: "8d9f8ccf-fbe9-42b0-84e5-b1913365a11f"). InnerVolumeSpecName "kube-api-access-fsjmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.917713 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c7d3f2ef-022b-41f5-84e5-6be42f48b023" (UID: "c7d3f2ef-022b-41f5-84e5-6be42f48b023"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.918455 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c347bb68-7140-4b8d-ae43-ee55b581c961-kube-api-access-ghmkb" (OuterVolumeSpecName: "kube-api-access-ghmkb") pod "c347bb68-7140-4b8d-ae43-ee55b581c961" (UID: "c347bb68-7140-4b8d-ae43-ee55b581c961"). InnerVolumeSpecName "kube-api-access-ghmkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.919049 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4702840c-d6fa-4dcd-bd95-4ac89f95d727-kube-api-access-65b95" (OuterVolumeSpecName: "kube-api-access-65b95") pod "4702840c-d6fa-4dcd-bd95-4ac89f95d727" (UID: "4702840c-d6fa-4dcd-bd95-4ac89f95d727"). InnerVolumeSpecName "kube-api-access-65b95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.931258 4867 generic.go:334] "Generic (PLEG): container finished" podID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerID="3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb" exitCode=0 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.931524 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcq6l" event={"ID":"c347bb68-7140-4b8d-ae43-ee55b581c961","Type":"ContainerDied","Data":"3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.931644 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcq6l" event={"ID":"c347bb68-7140-4b8d-ae43-ee55b581c961","Type":"ContainerDied","Data":"fa89d4da9867e4f58df7a5818221a7b3afd90ad080d230d250fd8528e489b0e2"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.931762 4867 scope.go:117] "RemoveContainer" containerID="3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.932013 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zcq6l" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.936398 4867 generic.go:334] "Generic (PLEG): container finished" podID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerID="305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4" exitCode=0 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.936584 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wrtl" event={"ID":"4702840c-d6fa-4dcd-bd95-4ac89f95d727","Type":"ContainerDied","Data":"305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.936698 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wrtl" event={"ID":"4702840c-d6fa-4dcd-bd95-4ac89f95d727","Type":"ContainerDied","Data":"9f6a4d8538d9c6fff75c6a7f89a7bc8e7f85f9efa2fd7e4656a27452b29b0a6b"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.936948 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wrtl" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.941195 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "545d34e2-c5a8-48ab-9603-9ae4986ab739" (UID: "545d34e2-c5a8-48ab-9603-9ae4986ab739"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.941887 4867 generic.go:334] "Generic (PLEG): container finished" podID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerID="66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f" exitCode=0 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.941961 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znqxw" event={"ID":"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88","Type":"ContainerDied","Data":"66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.941986 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-znqxw" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.941990 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znqxw" event={"ID":"d6f7e7b3-ad0a-41ab-8291-c5200fe31a88","Type":"ContainerDied","Data":"84f3a573996ef0f968e1a72c428fb33ef4b9a59ad4e8efa3205e1b189e419344"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.950132 4867 generic.go:334] "Generic (PLEG): container finished" podID="5a77c808-2899-42d0-95e6-72f00df5432f" containerID="1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a" exitCode=0 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.950357 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfrqj" event={"ID":"5a77c808-2899-42d0-95e6-72f00df5432f","Type":"ContainerDied","Data":"1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.950497 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfrqj" 
event={"ID":"5a77c808-2899-42d0-95e6-72f00df5432f","Type":"ContainerDied","Data":"2feef0f2d0d02f1d63d461d746b5bf27de96e8b42cdbad16b0aa35e9ed4f9693"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.950669 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfrqj" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.956950 4867 scope.go:117] "RemoveContainer" containerID="75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.958258 4867 generic.go:334] "Generic (PLEG): container finished" podID="c7d3f2ef-022b-41f5-84e5-6be42f48b023" containerID="b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18" exitCode=0 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.958405 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" event={"ID":"c7d3f2ef-022b-41f5-84e5-6be42f48b023","Type":"ContainerDied","Data":"b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.958447 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" event={"ID":"c7d3f2ef-022b-41f5-84e5-6be42f48b023","Type":"ContainerDied","Data":"ff85a59df9a2319c7da2c78eb739b04593065f603d6b6b89e2d8f559dd9a137f"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.958528 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6w6sm" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.969773 4867 generic.go:334] "Generic (PLEG): container finished" podID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerID="9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a" exitCode=0 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.970103 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwxb4" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.970151 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwxb4" event={"ID":"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f","Type":"ContainerDied","Data":"9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.971066 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwxb4" event={"ID":"8d9f8ccf-fbe9-42b0-84e5-b1913365a11f","Type":"ContainerDied","Data":"140d0e133ff553f42773c657ec6f6a5037a4e5efb5f3f4ced27e47afea6a621f"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.975042 4867 generic.go:334] "Generic (PLEG): container finished" podID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerID="29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a" exitCode=0 Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.975091 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nv8tc" event={"ID":"545d34e2-c5a8-48ab-9603-9ae4986ab739","Type":"ContainerDied","Data":"29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.975116 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nv8tc" 
event={"ID":"545d34e2-c5a8-48ab-9603-9ae4986ab739","Type":"ContainerDied","Data":"67c4a8b5a285b99aa43f1325bc3ee24138ef713e7629c0b8b46aa67d2fc460b3"} Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.975194 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nv8tc" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.976951 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-znqxw"] Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.981452 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-znqxw"] Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.988031 4867 scope.go:117] "RemoveContainer" containerID="55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.994798 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4702840c-d6fa-4dcd-bd95-4ac89f95d727" (UID: "4702840c-d6fa-4dcd-bd95-4ac89f95d727"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:44 crc kubenswrapper[4867]: I1201 09:12:44.999548 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfrqj"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.008067 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kfrqj"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025274 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025314 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025323 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025334 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025348 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65b95\" (UniqueName: \"kubernetes.io/projected/4702840c-d6fa-4dcd-bd95-4ac89f95d727-kube-api-access-65b95\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025360 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z7p5\" (UniqueName: \"kubernetes.io/projected/c7d3f2ef-022b-41f5-84e5-6be42f48b023-kube-api-access-7z7p5\") on node 
\"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025369 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfsbs\" (UniqueName: \"kubernetes.io/projected/545d34e2-c5a8-48ab-9603-9ae4986ab739-kube-api-access-rfsbs\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025378 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsjmq\" (UniqueName: \"kubernetes.io/projected/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-kube-api-access-fsjmq\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025388 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025400 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/545d34e2-c5a8-48ab-9603-9ae4986ab739-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025410 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7d3f2ef-022b-41f5-84e5-6be42f48b023-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025419 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghmkb\" (UniqueName: \"kubernetes.io/projected/c347bb68-7140-4b8d-ae43-ee55b581c961-kube-api-access-ghmkb\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.025431 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4702840c-d6fa-4dcd-bd95-4ac89f95d727-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc 
kubenswrapper[4867]: I1201 09:12:45.025939 4867 scope.go:117] "RemoveContainer" containerID="3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.029604 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb\": container with ID starting with 3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb not found: ID does not exist" containerID="3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.029644 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb"} err="failed to get container status \"3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb\": rpc error: code = NotFound desc = could not find container \"3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb\": container with ID starting with 3a3583fd43edc7e45de2907c196a22858c6a71063fa347ad12de6fbc902e18fb not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.029678 4867 scope.go:117] "RemoveContainer" containerID="75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.030415 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38\": container with ID starting with 75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38 not found: ID does not exist" containerID="75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.030447 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38"} err="failed to get container status \"75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38\": rpc error: code = NotFound desc = could not find container \"75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38\": container with ID starting with 75f9ed8a8c3a401d40aa5a13a290ee3039cfade80d33de9ccf450491d0718a38 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.030461 4867 scope.go:117] "RemoveContainer" containerID="55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.030795 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1\": container with ID starting with 55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1 not found: ID does not exist" containerID="55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.030881 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1"} err="failed to get container status \"55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1\": rpc error: code = NotFound desc = could not find container \"55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1\": container with ID starting with 55fd3f479ad5a6ed9653ecf015decb675d4c42e5bc5b16d69de43109e51411e1 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.030933 4867 scope.go:117] "RemoveContainer" containerID="305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.031459 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-6w6sm"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.038343 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6w6sm"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.042115 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nv8tc"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.044728 4867 scope.go:117] "RemoveContainer" containerID="6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.046861 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nv8tc"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.057657 4867 scope.go:117] "RemoveContainer" containerID="e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.060824 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" (UID: "8d9f8ccf-fbe9-42b0-84e5-b1913365a11f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.064295 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c347bb68-7140-4b8d-ae43-ee55b581c961" (UID: "c347bb68-7140-4b8d-ae43-ee55b581c961"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.069629 4867 scope.go:117] "RemoveContainer" containerID="305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.070088 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4\": container with ID starting with 305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4 not found: ID does not exist" containerID="305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.070210 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4"} err="failed to get container status \"305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4\": rpc error: code = NotFound desc = could not find container \"305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4\": container with ID starting with 305f9637cc21bccb7f075fb3fb9e8a1c9c9912d07256a0e65d68bc36458df8f4 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.070296 4867 scope.go:117] "RemoveContainer" containerID="6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.070719 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87\": container with ID starting with 6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87 not found: ID does not exist" containerID="6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.070755 
4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87"} err="failed to get container status \"6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87\": rpc error: code = NotFound desc = could not find container \"6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87\": container with ID starting with 6d0646eb99c6744aa5bd47a62522e6e81f0d6e2ba6cae3077039620e8d829f87 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.070778 4867 scope.go:117] "RemoveContainer" containerID="e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.071057 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20\": container with ID starting with e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20 not found: ID does not exist" containerID="e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.071090 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20"} err="failed to get container status \"e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20\": rpc error: code = NotFound desc = could not find container \"e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20\": container with ID starting with e0321e082f0ccbb4f695fc11c61a8b8a4b3570e1b43d43d38853e5d84750ca20 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.071118 4867 scope.go:117] "RemoveContainer" containerID="66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 
09:12:45.082884 4867 scope.go:117] "RemoveContainer" containerID="e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.095035 4867 scope.go:117] "RemoveContainer" containerID="1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.108907 4867 scope.go:117] "RemoveContainer" containerID="66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.109319 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f\": container with ID starting with 66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f not found: ID does not exist" containerID="66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.109360 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f"} err="failed to get container status \"66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f\": rpc error: code = NotFound desc = could not find container \"66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f\": container with ID starting with 66f93ba6d91139b2365b929e38ed7b88f49f8584eb586157d59ace4ca2f72b4f not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.109390 4867 scope.go:117] "RemoveContainer" containerID="e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.109771 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50\": container 
with ID starting with e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50 not found: ID does not exist" containerID="e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.109802 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50"} err="failed to get container status \"e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50\": rpc error: code = NotFound desc = could not find container \"e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50\": container with ID starting with e37bf53feeae0109f765f211aa8ccfa5ae45fdbb92ef635f38b53feb5ed73e50 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.109848 4867 scope.go:117] "RemoveContainer" containerID="1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.110148 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488\": container with ID starting with 1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488 not found: ID does not exist" containerID="1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.110171 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488"} err="failed to get container status \"1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488\": rpc error: code = NotFound desc = could not find container \"1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488\": container with ID starting with 1e6a2cb21c7807d8ebddf2cc9d4b495ed8119c95e2512bbabaeab908c8f5c488 not 
found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.110186 4867 scope.go:117] "RemoveContainer" containerID="1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.120880 4867 scope.go:117] "RemoveContainer" containerID="bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.126937 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c347bb68-7140-4b8d-ae43-ee55b581c961-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.126969 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.135860 4867 scope.go:117] "RemoveContainer" containerID="a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.147775 4867 scope.go:117] "RemoveContainer" containerID="1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.148167 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a\": container with ID starting with 1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a not found: ID does not exist" containerID="1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.148218 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a"} err="failed to get 
container status \"1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a\": rpc error: code = NotFound desc = could not find container \"1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a\": container with ID starting with 1d2d7d5a44d1d3a7123da638c13be7ad5c902e7bc8b2c338944b5dfa2f632e8a not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.148253 4867 scope.go:117] "RemoveContainer" containerID="bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.148582 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7\": container with ID starting with bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7 not found: ID does not exist" containerID="bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.148625 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7"} err="failed to get container status \"bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7\": rpc error: code = NotFound desc = could not find container \"bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7\": container with ID starting with bdbe5d0b60b731687241b6c5d9b4cf27ccd8234574bb80665f8fc56dec24dfa7 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.148654 4867 scope.go:117] "RemoveContainer" containerID="a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.149274 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403\": container with ID starting with a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403 not found: ID does not exist" containerID="a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.149331 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403"} err="failed to get container status \"a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403\": rpc error: code = NotFound desc = could not find container \"a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403\": container with ID starting with a14e51519db9554d4ae504206d354ab02b314118dbb920f9e6474bb5d3ef0403 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.149348 4867 scope.go:117] "RemoveContainer" containerID="b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.160311 4867 scope.go:117] "RemoveContainer" containerID="b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.160640 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18\": container with ID starting with b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18 not found: ID does not exist" containerID="b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.160674 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18"} err="failed to get container status 
\"b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18\": rpc error: code = NotFound desc = could not find container \"b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18\": container with ID starting with b066b91b18ccb5da1e9ab6c2316b6294409b3b18f2276443c434778248a1dc18 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.160725 4867 scope.go:117] "RemoveContainer" containerID="9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.172803 4867 scope.go:117] "RemoveContainer" containerID="013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.188616 4867 scope.go:117] "RemoveContainer" containerID="4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.204506 4867 scope.go:117] "RemoveContainer" containerID="9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.205327 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a\": container with ID starting with 9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a not found: ID does not exist" containerID="9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.205359 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a"} err="failed to get container status \"9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a\": rpc error: code = NotFound desc = could not find container \"9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a\": container with ID starting 
with 9156f251355a6c80dd8a0a065c59e86c6aaf74000cc46e243116cbc7c06dac4a not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.205383 4867 scope.go:117] "RemoveContainer" containerID="013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.205934 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe\": container with ID starting with 013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe not found: ID does not exist" containerID="013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.205959 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe"} err="failed to get container status \"013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe\": rpc error: code = NotFound desc = could not find container \"013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe\": container with ID starting with 013e761cfc8bdf38b0755ec7e1d83a0633b23600e29bf8e95221bbd5983bc1fe not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.205977 4867 scope.go:117] "RemoveContainer" containerID="4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.206224 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01\": container with ID starting with 4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01 not found: ID does not exist" containerID="4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01" Dec 01 09:12:45 
crc kubenswrapper[4867]: I1201 09:12:45.206252 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01"} err="failed to get container status \"4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01\": rpc error: code = NotFound desc = could not find container \"4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01\": container with ID starting with 4ebc94b6d0bf73529fbfcfeaa585252e1ebaf9e3938f7ed203390fb9e097fd01 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.206265 4867 scope.go:117] "RemoveContainer" containerID="29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.221345 4867 scope.go:117] "RemoveContainer" containerID="a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.240231 4867 scope.go:117] "RemoveContainer" containerID="891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.256615 4867 scope.go:117] "RemoveContainer" containerID="29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.257274 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a\": container with ID starting with 29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a not found: ID does not exist" containerID="29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.257350 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a"} 
err="failed to get container status \"29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a\": rpc error: code = NotFound desc = could not find container \"29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a\": container with ID starting with 29956ba0090af6092eaa556a0025578e62ca428457185a659067375b7932461a not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.257407 4867 scope.go:117] "RemoveContainer" containerID="a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.257979 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c\": container with ID starting with a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c not found: ID does not exist" containerID="a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.258336 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c"} err="failed to get container status \"a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c\": rpc error: code = NotFound desc = could not find container \"a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c\": container with ID starting with a30ffcf8b6e679e76e073d49fb72d6374e3447d0d5227e6d0ab0709d8d4ab15c not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.258395 4867 scope.go:117] "RemoveContainer" containerID="891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99" Dec 01 09:12:45 crc kubenswrapper[4867]: E1201 09:12:45.260110 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99\": container with ID starting with 891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99 not found: ID does not exist" containerID="891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.260182 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99"} err="failed to get container status \"891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99\": rpc error: code = NotFound desc = could not find container \"891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99\": container with ID starting with 891697f4dd90138e0c27340381b9112b99bb78dd61819e53edccc2c5b4070a99 not found: ID does not exist" Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.267869 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zcq6l"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.272025 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zcq6l"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.279795 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5wrtl"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.284004 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5wrtl"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.299573 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pwxb4"] Dec 01 09:12:45 crc kubenswrapper[4867]: I1201 09:12:45.303034 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pwxb4"] Dec 01 09:12:46 crc kubenswrapper[4867]: I1201 09:12:46.834240 4867 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" path="/var/lib/kubelet/pods/4702840c-d6fa-4dcd-bd95-4ac89f95d727/volumes" Dec 01 09:12:46 crc kubenswrapper[4867]: I1201 09:12:46.835401 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" path="/var/lib/kubelet/pods/545d34e2-c5a8-48ab-9603-9ae4986ab739/volumes" Dec 01 09:12:46 crc kubenswrapper[4867]: I1201 09:12:46.836268 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" path="/var/lib/kubelet/pods/5a77c808-2899-42d0-95e6-72f00df5432f/volumes" Dec 01 09:12:46 crc kubenswrapper[4867]: I1201 09:12:46.837600 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" path="/var/lib/kubelet/pods/8d9f8ccf-fbe9-42b0-84e5-b1913365a11f/volumes" Dec 01 09:12:46 crc kubenswrapper[4867]: I1201 09:12:46.838411 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" path="/var/lib/kubelet/pods/c347bb68-7140-4b8d-ae43-ee55b581c961/volumes" Dec 01 09:12:46 crc kubenswrapper[4867]: I1201 09:12:46.839872 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d3f2ef-022b-41f5-84e5-6be42f48b023" path="/var/lib/kubelet/pods/c7d3f2ef-022b-41f5-84e5-6be42f48b023/volumes" Dec 01 09:12:46 crc kubenswrapper[4867]: I1201 09:12:46.840542 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" path="/var/lib/kubelet/pods/d6f7e7b3-ad0a-41ab-8291-c5200fe31a88/volumes" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.835380 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gpdj"] Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836411 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" 
containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836444 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836460 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836467 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836479 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" containerName="installer" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836486 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" containerName="installer" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836496 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836504 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836515 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836521 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836533 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerName="extract-content" 
Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836539 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836547 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836553 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836568 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836580 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836592 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836601 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836610 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836616 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836623 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerName="extract-utilities" Dec 
01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836632 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836643 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836650 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836662 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836668 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836680 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836685 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836697 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836704 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836717 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d3f2ef-022b-41f5-84e5-6be42f48b023" containerName="marketplace-operator" Dec 
01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836724 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d3f2ef-022b-41f5-84e5-6be42f48b023" containerName="marketplace-operator" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836734 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836741 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836754 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836760 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836771 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836777 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerName="extract-utilities" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836787 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836793 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: E1201 09:12:57.836801 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerName="extract-content" Dec 01 
09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.836824 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerName="extract-content" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837016 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f7e7b3-ad0a-41ab-8291-c5200fe31a88" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837043 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55c2a62-535e-4781-a610-4eeea00a871c" containerName="installer" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837059 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c347bb68-7140-4b8d-ae43-ee55b581c961" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837074 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a77c808-2899-42d0-95e6-72f00df5432f" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837084 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837100 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d3f2ef-022b-41f5-84e5-6be42f48b023" containerName="marketplace-operator" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837108 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="545d34e2-c5a8-48ab-9603-9ae4986ab739" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837123 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9f8ccf-fbe9-42b0-84e5-b1913365a11f" containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837132 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4702840c-d6fa-4dcd-bd95-4ac89f95d727" 
containerName="registry-server" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.837720 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.852696 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.853080 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.853090 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.858187 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.863672 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.889792 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gpdj"] Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.992119 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqqnn\" (UniqueName: \"kubernetes.io/projected/a222161f-afcc-47dc-bc2f-50b228543866-kube-api-access-pqqnn\") pod \"marketplace-operator-79b997595-7gpdj\" (UID: \"a222161f-afcc-47dc-bc2f-50b228543866\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.992460 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/a222161f-afcc-47dc-bc2f-50b228543866-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7gpdj\" (UID: \"a222161f-afcc-47dc-bc2f-50b228543866\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:57 crc kubenswrapper[4867]: I1201 09:12:57.992651 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a222161f-afcc-47dc-bc2f-50b228543866-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7gpdj\" (UID: \"a222161f-afcc-47dc-bc2f-50b228543866\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:58 crc kubenswrapper[4867]: I1201 09:12:58.093445 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a222161f-afcc-47dc-bc2f-50b228543866-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7gpdj\" (UID: \"a222161f-afcc-47dc-bc2f-50b228543866\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:58 crc kubenswrapper[4867]: I1201 09:12:58.093798 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqqnn\" (UniqueName: \"kubernetes.io/projected/a222161f-afcc-47dc-bc2f-50b228543866-kube-api-access-pqqnn\") pod \"marketplace-operator-79b997595-7gpdj\" (UID: \"a222161f-afcc-47dc-bc2f-50b228543866\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:58 crc kubenswrapper[4867]: I1201 09:12:58.093850 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a222161f-afcc-47dc-bc2f-50b228543866-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7gpdj\" (UID: \"a222161f-afcc-47dc-bc2f-50b228543866\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:58 crc kubenswrapper[4867]: I1201 09:12:58.095159 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a222161f-afcc-47dc-bc2f-50b228543866-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7gpdj\" (UID: \"a222161f-afcc-47dc-bc2f-50b228543866\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:58 crc kubenswrapper[4867]: I1201 09:12:58.099555 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a222161f-afcc-47dc-bc2f-50b228543866-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7gpdj\" (UID: \"a222161f-afcc-47dc-bc2f-50b228543866\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:58 crc kubenswrapper[4867]: I1201 09:12:58.108970 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqqnn\" (UniqueName: \"kubernetes.io/projected/a222161f-afcc-47dc-bc2f-50b228543866-kube-api-access-pqqnn\") pod \"marketplace-operator-79b997595-7gpdj\" (UID: \"a222161f-afcc-47dc-bc2f-50b228543866\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:58 crc kubenswrapper[4867]: I1201 09:12:58.169915 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:58 crc kubenswrapper[4867]: I1201 09:12:58.542056 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gpdj"] Dec 01 09:12:59 crc kubenswrapper[4867]: I1201 09:12:59.069484 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" event={"ID":"a222161f-afcc-47dc-bc2f-50b228543866","Type":"ContainerStarted","Data":"57532f27f7c19fba6b5ccb17274289586c9186d3df1b753995b2e85ec102735e"} Dec 01 09:12:59 crc kubenswrapper[4867]: I1201 09:12:59.069771 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:59 crc kubenswrapper[4867]: I1201 09:12:59.069782 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" event={"ID":"a222161f-afcc-47dc-bc2f-50b228543866","Type":"ContainerStarted","Data":"a4c2600bcdaa6b2a871241bc09639bec0f5eb7029f96e39ae720e65a6c26c27d"} Dec 01 09:12:59 crc kubenswrapper[4867]: I1201 09:12:59.071245 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" Dec 01 09:12:59 crc kubenswrapper[4867]: I1201 09:12:59.090867 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7gpdj" podStartSLOduration=2.09084405 podStartE2EDuration="2.09084405s" podCreationTimestamp="2025-12-01 09:12:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:12:59.090551722 +0000 UTC m=+300.549938476" watchObservedRunningTime="2025-12-01 09:12:59.09084405 +0000 UTC m=+300.550230814" Dec 01 09:13:07 crc kubenswrapper[4867]: I1201 09:13:07.118091 4867 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 01 09:13:07 crc kubenswrapper[4867]: I1201 09:13:07.121447 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 09:13:07 crc kubenswrapper[4867]: I1201 09:13:07.121517 4867 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3bef36a5bcde2a09452d7fa49400a4f9ee7e5a03ebf88bca6dc4a8d58afde9ec" exitCode=137 Dec 01 09:13:07 crc kubenswrapper[4867]: I1201 09:13:07.121570 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3bef36a5bcde2a09452d7fa49400a4f9ee7e5a03ebf88bca6dc4a8d58afde9ec"} Dec 01 09:13:07 crc kubenswrapper[4867]: I1201 09:13:07.121652 4867 scope.go:117] "RemoveContainer" containerID="86eb8cdc3577de5201dcb22e5e95a3ac03eb55c942febb9dcc74742e84156b08" Dec 01 09:13:08 crc kubenswrapper[4867]: I1201 09:13:08.128402 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 01 09:13:09 crc kubenswrapper[4867]: I1201 09:13:09.138105 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 01 09:13:09 crc kubenswrapper[4867]: I1201 09:13:09.140243 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3f6a7724faffd088d3b2567274d9e2aecf8cafb549fd808112052bc2038c4c94"} Dec 01 09:13:16 crc kubenswrapper[4867]: I1201 09:13:16.970883 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:13:16 crc kubenswrapper[4867]: I1201 09:13:16.976730 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:13:17 crc kubenswrapper[4867]: I1201 09:13:17.180629 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:13:18 crc kubenswrapper[4867]: I1201 09:13:18.188650 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:13:28 crc kubenswrapper[4867]: I1201 09:13:28.860908 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7"] Dec 01 09:13:28 crc kubenswrapper[4867]: I1201 09:13:28.861771 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" podUID="0b30e79f-de92-4b18-8b47-31cd45e753f1" containerName="route-controller-manager" containerID="cri-o://1665cf0223812cc9388614badc6496840c24c829fb8af49496e17c7ff8f86ee5" gracePeriod=30 Dec 01 09:13:28 crc kubenswrapper[4867]: I1201 09:13:28.865703 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g8jfw"] Dec 01 09:13:28 crc kubenswrapper[4867]: I1201 09:13:28.866103 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" 
podUID="bc693be6-558a-41e9-96cd-40061ff9ae5d" containerName="controller-manager" containerID="cri-o://b23ee4aeebdadf10b06973288177e3a60db71204a2cd0d0fee5cb80ea87c28a4" gracePeriod=30 Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.244501 4867 generic.go:334] "Generic (PLEG): container finished" podID="0b30e79f-de92-4b18-8b47-31cd45e753f1" containerID="1665cf0223812cc9388614badc6496840c24c829fb8af49496e17c7ff8f86ee5" exitCode=0 Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.244639 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" event={"ID":"0b30e79f-de92-4b18-8b47-31cd45e753f1","Type":"ContainerDied","Data":"1665cf0223812cc9388614badc6496840c24c829fb8af49496e17c7ff8f86ee5"} Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.244894 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" event={"ID":"0b30e79f-de92-4b18-8b47-31cd45e753f1","Type":"ContainerDied","Data":"f35a74beeca72b2818af8e39226899c048dd1428ac7aba6d3aed41d4552cc761"} Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.244913 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f35a74beeca72b2818af8e39226899c048dd1428ac7aba6d3aed41d4552cc761" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.246739 4867 generic.go:334] "Generic (PLEG): container finished" podID="bc693be6-558a-41e9-96cd-40061ff9ae5d" containerID="b23ee4aeebdadf10b06973288177e3a60db71204a2cd0d0fee5cb80ea87c28a4" exitCode=0 Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.246772 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" event={"ID":"bc693be6-558a-41e9-96cd-40061ff9ae5d","Type":"ContainerDied","Data":"b23ee4aeebdadf10b06973288177e3a60db71204a2cd0d0fee5cb80ea87c28a4"} Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 
09:13:29.254613 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.318732 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.425867 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl9k6\" (UniqueName: \"kubernetes.io/projected/0b30e79f-de92-4b18-8b47-31cd45e753f1-kube-api-access-nl9k6\") pod \"0b30e79f-de92-4b18-8b47-31cd45e753f1\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.425908 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-proxy-ca-bundles\") pod \"bc693be6-558a-41e9-96cd-40061ff9ae5d\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.425926 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-config\") pod \"0b30e79f-de92-4b18-8b47-31cd45e753f1\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.425944 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68ns4\" (UniqueName: \"kubernetes.io/projected/bc693be6-558a-41e9-96cd-40061ff9ae5d-kube-api-access-68ns4\") pod \"bc693be6-558a-41e9-96cd-40061ff9ae5d\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.425968 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-client-ca\") pod \"bc693be6-558a-41e9-96cd-40061ff9ae5d\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.425991 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b30e79f-de92-4b18-8b47-31cd45e753f1-serving-cert\") pod \"0b30e79f-de92-4b18-8b47-31cd45e753f1\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.426034 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-client-ca\") pod \"0b30e79f-de92-4b18-8b47-31cd45e753f1\" (UID: \"0b30e79f-de92-4b18-8b47-31cd45e753f1\") " Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.426088 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc693be6-558a-41e9-96cd-40061ff9ae5d-serving-cert\") pod \"bc693be6-558a-41e9-96cd-40061ff9ae5d\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.426118 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-config\") pod \"bc693be6-558a-41e9-96cd-40061ff9ae5d\" (UID: \"bc693be6-558a-41e9-96cd-40061ff9ae5d\") " Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.426966 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-config" (OuterVolumeSpecName: "config") pod "bc693be6-558a-41e9-96cd-40061ff9ae5d" (UID: "bc693be6-558a-41e9-96cd-40061ff9ae5d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.427079 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bc693be6-558a-41e9-96cd-40061ff9ae5d" (UID: "bc693be6-558a-41e9-96cd-40061ff9ae5d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.427394 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-config" (OuterVolumeSpecName: "config") pod "0b30e79f-de92-4b18-8b47-31cd45e753f1" (UID: "0b30e79f-de92-4b18-8b47-31cd45e753f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.427471 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "0b30e79f-de92-4b18-8b47-31cd45e753f1" (UID: "0b30e79f-de92-4b18-8b47-31cd45e753f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.428672 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-client-ca" (OuterVolumeSpecName: "client-ca") pod "bc693be6-558a-41e9-96cd-40061ff9ae5d" (UID: "bc693be6-558a-41e9-96cd-40061ff9ae5d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.431829 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b30e79f-de92-4b18-8b47-31cd45e753f1-kube-api-access-nl9k6" (OuterVolumeSpecName: "kube-api-access-nl9k6") pod "0b30e79f-de92-4b18-8b47-31cd45e753f1" (UID: "0b30e79f-de92-4b18-8b47-31cd45e753f1"). InnerVolumeSpecName "kube-api-access-nl9k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.431922 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc693be6-558a-41e9-96cd-40061ff9ae5d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc693be6-558a-41e9-96cd-40061ff9ae5d" (UID: "bc693be6-558a-41e9-96cd-40061ff9ae5d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.432471 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b30e79f-de92-4b18-8b47-31cd45e753f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b30e79f-de92-4b18-8b47-31cd45e753f1" (UID: "0b30e79f-de92-4b18-8b47-31cd45e753f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.432528 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc693be6-558a-41e9-96cd-40061ff9ae5d-kube-api-access-68ns4" (OuterVolumeSpecName: "kube-api-access-68ns4") pod "bc693be6-558a-41e9-96cd-40061ff9ae5d" (UID: "bc693be6-558a-41e9-96cd-40061ff9ae5d"). InnerVolumeSpecName "kube-api-access-68ns4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.527569 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc693be6-558a-41e9-96cd-40061ff9ae5d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.527605 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.527614 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl9k6\" (UniqueName: \"kubernetes.io/projected/0b30e79f-de92-4b18-8b47-31cd45e753f1-kube-api-access-nl9k6\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.527624 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68ns4\" (UniqueName: \"kubernetes.io/projected/bc693be6-558a-41e9-96cd-40061ff9ae5d-kube-api-access-68ns4\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.527632 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.527640 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.527648 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc693be6-558a-41e9-96cd-40061ff9ae5d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.527657 4867 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b30e79f-de92-4b18-8b47-31cd45e753f1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:29 crc kubenswrapper[4867]: I1201 09:13:29.527667 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b30e79f-de92-4b18-8b47-31cd45e753f1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.254245 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.257032 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.257326 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g8jfw" event={"ID":"bc693be6-558a-41e9-96cd-40061ff9ae5d","Type":"ContainerDied","Data":"20aefd935014f46d0cfa81d7c2d4350472d82d5d470c1167a6da74bf5a2026f7"} Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.257384 4867 scope.go:117] "RemoveContainer" containerID="b23ee4aeebdadf10b06973288177e3a60db71204a2cd0d0fee5cb80ea87c28a4" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.280383 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7"] Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.284877 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vl9g7"] Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.292601 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-g8jfw"] Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.296100 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g8jfw"] Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.350154 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x"] Dec 01 09:13:30 crc kubenswrapper[4867]: E1201 09:13:30.350331 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc693be6-558a-41e9-96cd-40061ff9ae5d" containerName="controller-manager" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.350342 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc693be6-558a-41e9-96cd-40061ff9ae5d" containerName="controller-manager" Dec 01 09:13:30 crc kubenswrapper[4867]: E1201 09:13:30.350352 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b30e79f-de92-4b18-8b47-31cd45e753f1" containerName="route-controller-manager" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.350358 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b30e79f-de92-4b18-8b47-31cd45e753f1" containerName="route-controller-manager" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.350427 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b30e79f-de92-4b18-8b47-31cd45e753f1" containerName="route-controller-manager" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.350443 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc693be6-558a-41e9-96cd-40061ff9ae5d" containerName="controller-manager" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.350739 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.352566 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.352673 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.352777 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.353231 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.353417 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.354013 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.354261 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58bf4dd977-w7pkh"] Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.354702 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.359456 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.359794 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.361318 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.361552 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.361996 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.366524 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x"] Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.366708 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.371230 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.373617 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58bf4dd977-w7pkh"] Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.538709 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-serving-cert\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.538759 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-proxy-ca-bundles\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.538781 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjd6\" (UniqueName: \"kubernetes.io/projected/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-kube-api-access-dbjd6\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.538930 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-config\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.538951 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddbe0ec4-87e3-4c89-8648-f17e661f6111-serving-cert\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " 
pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.538970 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-config\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.539019 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-client-ca\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.539047 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqw82\" (UniqueName: \"kubernetes.io/projected/ddbe0ec4-87e3-4c89-8648-f17e661f6111-kube-api-access-hqw82\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.539085 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-client-ca\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.640629 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-serving-cert\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.640697 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-proxy-ca-bundles\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.640724 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjd6\" (UniqueName: \"kubernetes.io/projected/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-kube-api-access-dbjd6\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.640764 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-config\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.640789 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddbe0ec4-87e3-4c89-8648-f17e661f6111-serving-cert\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " 
pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.640837 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-config\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.640876 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-client-ca\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.640902 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqw82\" (UniqueName: \"kubernetes.io/projected/ddbe0ec4-87e3-4c89-8648-f17e661f6111-kube-api-access-hqw82\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.640938 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-client-ca\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.642002 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-client-ca\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.642115 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-config\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.642240 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-client-ca\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.642474 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-config\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.643157 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-proxy-ca-bundles\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.645901 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-serving-cert\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.645914 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddbe0ec4-87e3-4c89-8648-f17e661f6111-serving-cert\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.666496 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjd6\" (UniqueName: \"kubernetes.io/projected/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-kube-api-access-dbjd6\") pod \"controller-manager-58bf4dd977-w7pkh\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.670964 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqw82\" (UniqueName: \"kubernetes.io/projected/ddbe0ec4-87e3-4c89-8648-f17e661f6111-kube-api-access-hqw82\") pod \"route-controller-manager-7cbb74dfd9-cmv6x\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.679709 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.835748 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b30e79f-de92-4b18-8b47-31cd45e753f1" path="/var/lib/kubelet/pods/0b30e79f-de92-4b18-8b47-31cd45e753f1/volumes" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.836856 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc693be6-558a-41e9-96cd-40061ff9ae5d" path="/var/lib/kubelet/pods/bc693be6-558a-41e9-96cd-40061ff9ae5d/volumes" Dec 01 09:13:30 crc kubenswrapper[4867]: I1201 09:13:30.969336 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:31 crc kubenswrapper[4867]: I1201 09:13:31.095571 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58bf4dd977-w7pkh"] Dec 01 09:13:31 crc kubenswrapper[4867]: I1201 09:13:31.175724 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x"] Dec 01 09:13:31 crc kubenswrapper[4867]: I1201 09:13:31.261248 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" event={"ID":"01bfe5d2-9117-4932-b6d7-5eeeb790ce79","Type":"ContainerStarted","Data":"0bea9190f23694740cead4e4bf2eb9868c68747d21be36beee24a23dcb5f8e63"} Dec 01 09:13:31 crc kubenswrapper[4867]: I1201 09:13:31.262532 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" event={"ID":"ddbe0ec4-87e3-4c89-8648-f17e661f6111","Type":"ContainerStarted","Data":"f8b48e8146bc82862c69ce0e3b5906739f881cb61f398bab8d857f0e63336ec2"} Dec 01 09:13:32 crc kubenswrapper[4867]: I1201 09:13:32.272683 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" event={"ID":"01bfe5d2-9117-4932-b6d7-5eeeb790ce79","Type":"ContainerStarted","Data":"521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3"} Dec 01 09:13:32 crc kubenswrapper[4867]: I1201 09:13:32.273046 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:32 crc kubenswrapper[4867]: I1201 09:13:32.273957 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" event={"ID":"ddbe0ec4-87e3-4c89-8648-f17e661f6111","Type":"ContainerStarted","Data":"adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca"} Dec 01 09:13:32 crc kubenswrapper[4867]: I1201 09:13:32.274327 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:32 crc kubenswrapper[4867]: I1201 09:13:32.279826 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:32 crc kubenswrapper[4867]: I1201 09:13:32.280899 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:13:32 crc kubenswrapper[4867]: I1201 09:13:32.293084 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" podStartSLOduration=4.293063916 podStartE2EDuration="4.293063916s" podCreationTimestamp="2025-12-01 09:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:13:32.291686327 +0000 UTC m=+333.751073101" watchObservedRunningTime="2025-12-01 
09:13:32.293063916 +0000 UTC m=+333.752450670" Dec 01 09:13:32 crc kubenswrapper[4867]: I1201 09:13:32.340002 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" podStartSLOduration=4.339986708 podStartE2EDuration="4.339986708s" podCreationTimestamp="2025-12-01 09:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:13:32.337038156 +0000 UTC m=+333.796424910" watchObservedRunningTime="2025-12-01 09:13:32.339986708 +0000 UTC m=+333.799373452" Dec 01 09:13:34 crc kubenswrapper[4867]: I1201 09:13:34.266180 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x"] Dec 01 09:13:35 crc kubenswrapper[4867]: I1201 09:13:35.288973 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" podUID="ddbe0ec4-87e3-4c89-8648-f17e661f6111" containerName="route-controller-manager" containerID="cri-o://adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca" gracePeriod=30 Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.240074 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.281245 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7"] Dec 01 09:13:36 crc kubenswrapper[4867]: E1201 09:13:36.281448 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddbe0ec4-87e3-4c89-8648-f17e661f6111" containerName="route-controller-manager" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.281459 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddbe0ec4-87e3-4c89-8648-f17e661f6111" containerName="route-controller-manager" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.281550 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddbe0ec4-87e3-4c89-8648-f17e661f6111" containerName="route-controller-manager" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.281911 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.292805 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7"] Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.302415 4867 generic.go:334] "Generic (PLEG): container finished" podID="ddbe0ec4-87e3-4c89-8648-f17e661f6111" containerID="adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca" exitCode=0 Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.302469 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" event={"ID":"ddbe0ec4-87e3-4c89-8648-f17e661f6111","Type":"ContainerDied","Data":"adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca"} Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.302501 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" event={"ID":"ddbe0ec4-87e3-4c89-8648-f17e661f6111","Type":"ContainerDied","Data":"f8b48e8146bc82862c69ce0e3b5906739f881cb61f398bab8d857f0e63336ec2"} Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.302520 4867 scope.go:117] "RemoveContainer" containerID="adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.302634 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.320289 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddbe0ec4-87e3-4c89-8648-f17e661f6111-serving-cert\") pod \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.320789 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-config\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.320841 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-serving-cert\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.320865 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rclx\" (UniqueName: \"kubernetes.io/projected/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-kube-api-access-9rclx\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.320952 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-client-ca\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.343044 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddbe0ec4-87e3-4c89-8648-f17e661f6111-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ddbe0ec4-87e3-4c89-8648-f17e661f6111" (UID: "ddbe0ec4-87e3-4c89-8648-f17e661f6111"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.347532 4867 scope.go:117] "RemoveContainer" containerID="adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca" Dec 01 09:13:36 crc kubenswrapper[4867]: E1201 09:13:36.348578 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca\": container with ID starting with adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca not found: ID does not exist" containerID="adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.348630 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca"} err="failed to get container status \"adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca\": rpc error: code = NotFound desc = could not find container \"adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca\": container with ID starting with adb2ad2c54e778790a73b05679482b1311ebf7666b3bf2e0e35c56f2d038c5ca not found: ID does not exist" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 
09:13:36.421180 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-config\") pod \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.421229 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-client-ca\") pod \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.421256 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqw82\" (UniqueName: \"kubernetes.io/projected/ddbe0ec4-87e3-4c89-8648-f17e661f6111-kube-api-access-hqw82\") pod \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\" (UID: \"ddbe0ec4-87e3-4c89-8648-f17e661f6111\") " Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.421318 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-config\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.421340 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-serving-cert\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.421356 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9rclx\" (UniqueName: \"kubernetes.io/projected/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-kube-api-access-9rclx\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.421417 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-client-ca\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.421450 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddbe0ec4-87e3-4c89-8648-f17e661f6111-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.421734 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-client-ca" (OuterVolumeSpecName: "client-ca") pod "ddbe0ec4-87e3-4c89-8648-f17e661f6111" (UID: "ddbe0ec4-87e3-4c89-8648-f17e661f6111"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.421800 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-config" (OuterVolumeSpecName: "config") pod "ddbe0ec4-87e3-4c89-8648-f17e661f6111" (UID: "ddbe0ec4-87e3-4c89-8648-f17e661f6111"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.423221 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-client-ca\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.425184 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddbe0ec4-87e3-4c89-8648-f17e661f6111-kube-api-access-hqw82" (OuterVolumeSpecName: "kube-api-access-hqw82") pod "ddbe0ec4-87e3-4c89-8648-f17e661f6111" (UID: "ddbe0ec4-87e3-4c89-8648-f17e661f6111"). InnerVolumeSpecName "kube-api-access-hqw82". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.426601 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-config\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.430890 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-serving-cert\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.444772 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rclx\" (UniqueName: 
\"kubernetes.io/projected/f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36-kube-api-access-9rclx\") pod \"route-controller-manager-7658478444-mpsf7\" (UID: \"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36\") " pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.522122 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.522156 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddbe0ec4-87e3-4c89-8648-f17e661f6111-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.522169 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqw82\" (UniqueName: \"kubernetes.io/projected/ddbe0ec4-87e3-4c89-8648-f17e661f6111-kube-api-access-hqw82\") on node \"crc\" DevicePath \"\"" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.603749 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.677702 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x"] Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.683721 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cbb74dfd9-cmv6x"] Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.833649 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddbe0ec4-87e3-4c89-8648-f17e661f6111" path="/var/lib/kubelet/pods/ddbe0ec4-87e3-4c89-8648-f17e661f6111/volumes" Dec 01 09:13:36 crc kubenswrapper[4867]: I1201 09:13:36.989201 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7"] Dec 01 09:13:37 crc kubenswrapper[4867]: I1201 09:13:37.307904 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" event={"ID":"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36","Type":"ContainerStarted","Data":"d6dc0ba8cd0741d6eea216edfff3191560a4be11850b3c35145a0f19839f0160"} Dec 01 09:13:37 crc kubenswrapper[4867]: I1201 09:13:37.308294 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" event={"ID":"f5ed43ce-6ef9-4bb6-a6fb-6c2c5e54ea36","Type":"ContainerStarted","Data":"073d4edddc4dc976b5d0579e6c040d44376a2bbc2ec4fcd932e3878eeddbfee7"} Dec 01 09:13:37 crc kubenswrapper[4867]: I1201 09:13:37.308548 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:37 crc kubenswrapper[4867]: I1201 09:13:37.326072 4867 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" podStartSLOduration=3.326042703 podStartE2EDuration="3.326042703s" podCreationTimestamp="2025-12-01 09:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:13:37.323279175 +0000 UTC m=+338.782665939" watchObservedRunningTime="2025-12-01 09:13:37.326042703 +0000 UTC m=+338.785429457" Dec 01 09:13:37 crc kubenswrapper[4867]: I1201 09:13:37.513188 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7658478444-mpsf7" Dec 01 09:13:51 crc kubenswrapper[4867]: I1201 09:13:51.601665 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:13:51 crc kubenswrapper[4867]: I1201 09:13:51.602321 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.618343 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qqrrr"] Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.619959 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.622256 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.634394 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqrrr"] Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.774902 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt55g\" (UniqueName: \"kubernetes.io/projected/899e126d-0b32-4d48-b5c4-acc83cea5de4-kube-api-access-wt55g\") pod \"redhat-marketplace-qqrrr\" (UID: \"899e126d-0b32-4d48-b5c4-acc83cea5de4\") " pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.774992 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899e126d-0b32-4d48-b5c4-acc83cea5de4-catalog-content\") pod \"redhat-marketplace-qqrrr\" (UID: \"899e126d-0b32-4d48-b5c4-acc83cea5de4\") " pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.775088 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899e126d-0b32-4d48-b5c4-acc83cea5de4-utilities\") pod \"redhat-marketplace-qqrrr\" (UID: \"899e126d-0b32-4d48-b5c4-acc83cea5de4\") " pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.875763 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt55g\" (UniqueName: \"kubernetes.io/projected/899e126d-0b32-4d48-b5c4-acc83cea5de4-kube-api-access-wt55g\") pod \"redhat-marketplace-qqrrr\" (UID: 
\"899e126d-0b32-4d48-b5c4-acc83cea5de4\") " pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.876101 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899e126d-0b32-4d48-b5c4-acc83cea5de4-catalog-content\") pod \"redhat-marketplace-qqrrr\" (UID: \"899e126d-0b32-4d48-b5c4-acc83cea5de4\") " pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.876276 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899e126d-0b32-4d48-b5c4-acc83cea5de4-utilities\") pod \"redhat-marketplace-qqrrr\" (UID: \"899e126d-0b32-4d48-b5c4-acc83cea5de4\") " pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.876755 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899e126d-0b32-4d48-b5c4-acc83cea5de4-catalog-content\") pod \"redhat-marketplace-qqrrr\" (UID: \"899e126d-0b32-4d48-b5c4-acc83cea5de4\") " pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.876756 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899e126d-0b32-4d48-b5c4-acc83cea5de4-utilities\") pod \"redhat-marketplace-qqrrr\" (UID: \"899e126d-0b32-4d48-b5c4-acc83cea5de4\") " pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.897290 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt55g\" (UniqueName: \"kubernetes.io/projected/899e126d-0b32-4d48-b5c4-acc83cea5de4-kube-api-access-wt55g\") pod \"redhat-marketplace-qqrrr\" (UID: \"899e126d-0b32-4d48-b5c4-acc83cea5de4\") " 
pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:01 crc kubenswrapper[4867]: I1201 09:14:01.936120 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.217196 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tzz5n"] Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.218661 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.226212 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.230599 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tzz5n"] Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.280274 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-utilities\") pod \"redhat-operators-tzz5n\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.280532 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-catalog-content\") pod \"redhat-operators-tzz5n\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.280598 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t72p\" (UniqueName: 
\"kubernetes.io/projected/9dd83b39-8ab6-4e60-9ff6-53129612dff2-kube-api-access-2t72p\") pod \"redhat-operators-tzz5n\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.333633 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqrrr"] Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.381340 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-catalog-content\") pod \"redhat-operators-tzz5n\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.381427 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t72p\" (UniqueName: \"kubernetes.io/projected/9dd83b39-8ab6-4e60-9ff6-53129612dff2-kube-api-access-2t72p\") pod \"redhat-operators-tzz5n\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.381470 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-utilities\") pod \"redhat-operators-tzz5n\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.381980 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-catalog-content\") pod \"redhat-operators-tzz5n\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 
09:14:02.382040 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-utilities\") pod \"redhat-operators-tzz5n\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.406469 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t72p\" (UniqueName: \"kubernetes.io/projected/9dd83b39-8ab6-4e60-9ff6-53129612dff2-kube-api-access-2t72p\") pod \"redhat-operators-tzz5n\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.424488 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqrrr" event={"ID":"899e126d-0b32-4d48-b5c4-acc83cea5de4","Type":"ContainerStarted","Data":"7a52bb6481151fca13eb5f5df811ec6e5ec682b8fee6deef2cbe2514679450c3"} Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.540947 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:02 crc kubenswrapper[4867]: I1201 09:14:02.940566 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tzz5n"] Dec 01 09:14:02 crc kubenswrapper[4867]: W1201 09:14:02.947635 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dd83b39_8ab6_4e60_9ff6_53129612dff2.slice/crio-fb75eff09257f0c167b1d2d9d10e2b485074459970b3e0b8235e004e257dcc96 WatchSource:0}: Error finding container fb75eff09257f0c167b1d2d9d10e2b485074459970b3e0b8235e004e257dcc96: Status 404 returned error can't find the container with id fb75eff09257f0c167b1d2d9d10e2b485074459970b3e0b8235e004e257dcc96 Dec 01 09:14:03 crc kubenswrapper[4867]: I1201 09:14:03.429710 4867 generic.go:334] "Generic (PLEG): container finished" podID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerID="1170a7643f0cdc92c2664fe1ff7e6069be403af36c758ad0b1c168fab7584a70" exitCode=0 Dec 01 09:14:03 crc kubenswrapper[4867]: I1201 09:14:03.429776 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzz5n" event={"ID":"9dd83b39-8ab6-4e60-9ff6-53129612dff2","Type":"ContainerDied","Data":"1170a7643f0cdc92c2664fe1ff7e6069be403af36c758ad0b1c168fab7584a70"} Dec 01 09:14:03 crc kubenswrapper[4867]: I1201 09:14:03.429801 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzz5n" event={"ID":"9dd83b39-8ab6-4e60-9ff6-53129612dff2","Type":"ContainerStarted","Data":"fb75eff09257f0c167b1d2d9d10e2b485074459970b3e0b8235e004e257dcc96"} Dec 01 09:14:03 crc kubenswrapper[4867]: I1201 09:14:03.431932 4867 generic.go:334] "Generic (PLEG): container finished" podID="899e126d-0b32-4d48-b5c4-acc83cea5de4" containerID="7c41fb0ee091028634b16bb3d81baab07546e4ab446f9b69c0b6b0f8fe664bcc" exitCode=0 Dec 01 09:14:03 crc kubenswrapper[4867]: I1201 09:14:03.431957 
4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqrrr" event={"ID":"899e126d-0b32-4d48-b5c4-acc83cea5de4","Type":"ContainerDied","Data":"7c41fb0ee091028634b16bb3d81baab07546e4ab446f9b69c0b6b0f8fe664bcc"} Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.021081 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h5cxt"] Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.022098 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.024448 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.033785 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5cxt"] Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.202444 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-catalog-content\") pod \"community-operators-h5cxt\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.202891 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn58n\" (UniqueName: \"kubernetes.io/projected/4415af65-5e2b-472d-b687-54fa137bf02e-kube-api-access-tn58n\") pod \"community-operators-h5cxt\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.202925 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-utilities\") pod \"community-operators-h5cxt\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.303710 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn58n\" (UniqueName: \"kubernetes.io/projected/4415af65-5e2b-472d-b687-54fa137bf02e-kube-api-access-tn58n\") pod \"community-operators-h5cxt\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.303769 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-utilities\") pod \"community-operators-h5cxt\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.303862 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-catalog-content\") pod \"community-operators-h5cxt\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.304271 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-catalog-content\") pod \"community-operators-h5cxt\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.304387 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-utilities\") pod \"community-operators-h5cxt\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.329213 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn58n\" (UniqueName: \"kubernetes.io/projected/4415af65-5e2b-472d-b687-54fa137bf02e-kube-api-access-tn58n\") pod \"community-operators-h5cxt\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.353656 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.441120 4867 generic.go:334] "Generic (PLEG): container finished" podID="899e126d-0b32-4d48-b5c4-acc83cea5de4" containerID="3196c8183afadb7c96bba2a4f468cc1651ae230b2d3822b647c62a63b5ca6b2d" exitCode=0 Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.441162 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqrrr" event={"ID":"899e126d-0b32-4d48-b5c4-acc83cea5de4","Type":"ContainerDied","Data":"3196c8183afadb7c96bba2a4f468cc1651ae230b2d3822b647c62a63b5ca6b2d"} Dec 01 09:14:04 crc kubenswrapper[4867]: I1201 09:14:04.754769 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5cxt"] Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.018366 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r6wmp"] Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.019643 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.021793 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.031770 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6wmp"] Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.214407 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ccb15d-a0d6-4799-87e4-99cf2489fa16-utilities\") pod \"certified-operators-r6wmp\" (UID: \"62ccb15d-a0d6-4799-87e4-99cf2489fa16\") " pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.214530 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqnp\" (UniqueName: \"kubernetes.io/projected/62ccb15d-a0d6-4799-87e4-99cf2489fa16-kube-api-access-8dqnp\") pod \"certified-operators-r6wmp\" (UID: \"62ccb15d-a0d6-4799-87e4-99cf2489fa16\") " pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.214611 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ccb15d-a0d6-4799-87e4-99cf2489fa16-catalog-content\") pod \"certified-operators-r6wmp\" (UID: \"62ccb15d-a0d6-4799-87e4-99cf2489fa16\") " pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.315589 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ccb15d-a0d6-4799-87e4-99cf2489fa16-utilities\") pod \"certified-operators-r6wmp\" (UID: 
\"62ccb15d-a0d6-4799-87e4-99cf2489fa16\") " pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.315966 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dqnp\" (UniqueName: \"kubernetes.io/projected/62ccb15d-a0d6-4799-87e4-99cf2489fa16-kube-api-access-8dqnp\") pod \"certified-operators-r6wmp\" (UID: \"62ccb15d-a0d6-4799-87e4-99cf2489fa16\") " pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.316099 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ccb15d-a0d6-4799-87e4-99cf2489fa16-catalog-content\") pod \"certified-operators-r6wmp\" (UID: \"62ccb15d-a0d6-4799-87e4-99cf2489fa16\") " pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.316174 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ccb15d-a0d6-4799-87e4-99cf2489fa16-utilities\") pod \"certified-operators-r6wmp\" (UID: \"62ccb15d-a0d6-4799-87e4-99cf2489fa16\") " pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.316415 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ccb15d-a0d6-4799-87e4-99cf2489fa16-catalog-content\") pod \"certified-operators-r6wmp\" (UID: \"62ccb15d-a0d6-4799-87e4-99cf2489fa16\") " pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.338434 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqnp\" (UniqueName: \"kubernetes.io/projected/62ccb15d-a0d6-4799-87e4-99cf2489fa16-kube-api-access-8dqnp\") pod \"certified-operators-r6wmp\" (UID: 
\"62ccb15d-a0d6-4799-87e4-99cf2489fa16\") " pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.371036 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.470221 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqrrr" event={"ID":"899e126d-0b32-4d48-b5c4-acc83cea5de4","Type":"ContainerStarted","Data":"a324a4097cc96629ec895503c62406b40772b918ca408435ea4485e47a75a857"} Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.479562 4867 generic.go:334] "Generic (PLEG): container finished" podID="4415af65-5e2b-472d-b687-54fa137bf02e" containerID="beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d" exitCode=0 Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.479642 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5cxt" event={"ID":"4415af65-5e2b-472d-b687-54fa137bf02e","Type":"ContainerDied","Data":"beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d"} Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.479668 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5cxt" event={"ID":"4415af65-5e2b-472d-b687-54fa137bf02e","Type":"ContainerStarted","Data":"684ee07a07d5c496930fc507ac5f80fc8d78deb7876e90c2024be1f38a796409"} Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.485966 4867 generic.go:334] "Generic (PLEG): container finished" podID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerID="e65e3c90b12e72a1f4774cde19986d8546588bf943a14e8ec755d3f3c11688b8" exitCode=0 Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.486006 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzz5n" 
event={"ID":"9dd83b39-8ab6-4e60-9ff6-53129612dff2","Type":"ContainerDied","Data":"e65e3c90b12e72a1f4774cde19986d8546588bf943a14e8ec755d3f3c11688b8"} Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.496595 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qqrrr" podStartSLOduration=2.855854662 podStartE2EDuration="4.496578584s" podCreationTimestamp="2025-12-01 09:14:01 +0000 UTC" firstStartedPulling="2025-12-01 09:14:03.435338945 +0000 UTC m=+364.894725699" lastFinishedPulling="2025-12-01 09:14:05.076062847 +0000 UTC m=+366.535449621" observedRunningTime="2025-12-01 09:14:05.493749095 +0000 UTC m=+366.953135839" watchObservedRunningTime="2025-12-01 09:14:05.496578584 +0000 UTC m=+366.955965338" Dec 01 09:14:05 crc kubenswrapper[4867]: I1201 09:14:05.786174 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6wmp"] Dec 01 09:14:05 crc kubenswrapper[4867]: W1201 09:14:05.791349 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62ccb15d_a0d6_4799_87e4_99cf2489fa16.slice/crio-13e229ae5c372d6d4c751421da2420bd0c59f861d2961fd715e2515a0fbd5036 WatchSource:0}: Error finding container 13e229ae5c372d6d4c751421da2420bd0c59f861d2961fd715e2515a0fbd5036: Status 404 returned error can't find the container with id 13e229ae5c372d6d4c751421da2420bd0c59f861d2961fd715e2515a0fbd5036 Dec 01 09:14:06 crc kubenswrapper[4867]: I1201 09:14:06.494006 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5cxt" event={"ID":"4415af65-5e2b-472d-b687-54fa137bf02e","Type":"ContainerStarted","Data":"120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61"} Dec 01 09:14:06 crc kubenswrapper[4867]: I1201 09:14:06.496113 4867 generic.go:334] "Generic (PLEG): container finished" podID="62ccb15d-a0d6-4799-87e4-99cf2489fa16" 
containerID="729e534dda5e61fd83b87c772760b93f040ec62592e7e1186863033d9cd1f275" exitCode=0 Dec 01 09:14:06 crc kubenswrapper[4867]: I1201 09:14:06.496172 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6wmp" event={"ID":"62ccb15d-a0d6-4799-87e4-99cf2489fa16","Type":"ContainerDied","Data":"729e534dda5e61fd83b87c772760b93f040ec62592e7e1186863033d9cd1f275"} Dec 01 09:14:06 crc kubenswrapper[4867]: I1201 09:14:06.496192 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6wmp" event={"ID":"62ccb15d-a0d6-4799-87e4-99cf2489fa16","Type":"ContainerStarted","Data":"13e229ae5c372d6d4c751421da2420bd0c59f861d2961fd715e2515a0fbd5036"} Dec 01 09:14:06 crc kubenswrapper[4867]: I1201 09:14:06.499432 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzz5n" event={"ID":"9dd83b39-8ab6-4e60-9ff6-53129612dff2","Type":"ContainerStarted","Data":"a8cbfc39946c027dde8cc2267dfe39fd842fd3db140992ed421df680fd7d77d5"} Dec 01 09:14:06 crc kubenswrapper[4867]: I1201 09:14:06.547172 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tzz5n" podStartSLOduration=1.868115473 podStartE2EDuration="4.547154137s" podCreationTimestamp="2025-12-01 09:14:02 +0000 UTC" firstStartedPulling="2025-12-01 09:14:03.431308441 +0000 UTC m=+364.890695195" lastFinishedPulling="2025-12-01 09:14:06.110347095 +0000 UTC m=+367.569733859" observedRunningTime="2025-12-01 09:14:06.529520715 +0000 UTC m=+367.988907469" watchObservedRunningTime="2025-12-01 09:14:06.547154137 +0000 UTC m=+368.006540891" Dec 01 09:14:07 crc kubenswrapper[4867]: I1201 09:14:07.506276 4867 generic.go:334] "Generic (PLEG): container finished" podID="4415af65-5e2b-472d-b687-54fa137bf02e" containerID="120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61" exitCode=0 Dec 01 09:14:07 crc kubenswrapper[4867]: I1201 
09:14:07.507728 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5cxt" event={"ID":"4415af65-5e2b-472d-b687-54fa137bf02e","Type":"ContainerDied","Data":"120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61"} Dec 01 09:14:08 crc kubenswrapper[4867]: I1201 09:14:08.512738 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5cxt" event={"ID":"4415af65-5e2b-472d-b687-54fa137bf02e","Type":"ContainerStarted","Data":"11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a"} Dec 01 09:14:08 crc kubenswrapper[4867]: I1201 09:14:08.515034 4867 generic.go:334] "Generic (PLEG): container finished" podID="62ccb15d-a0d6-4799-87e4-99cf2489fa16" containerID="583205d7ac3b252335647256ec888123d4c265e9ceae465067fcefe2a552b241" exitCode=0 Dec 01 09:14:08 crc kubenswrapper[4867]: I1201 09:14:08.515072 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6wmp" event={"ID":"62ccb15d-a0d6-4799-87e4-99cf2489fa16","Type":"ContainerDied","Data":"583205d7ac3b252335647256ec888123d4c265e9ceae465067fcefe2a552b241"} Dec 01 09:14:08 crc kubenswrapper[4867]: I1201 09:14:08.538668 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h5cxt" podStartSLOduration=1.9687768970000001 podStartE2EDuration="4.538607287s" podCreationTimestamp="2025-12-01 09:14:04 +0000 UTC" firstStartedPulling="2025-12-01 09:14:05.481167254 +0000 UTC m=+366.940554008" lastFinishedPulling="2025-12-01 09:14:08.050997644 +0000 UTC m=+369.510384398" observedRunningTime="2025-12-01 09:14:08.533100393 +0000 UTC m=+369.992487157" watchObservedRunningTime="2025-12-01 09:14:08.538607287 +0000 UTC m=+369.997994041" Dec 01 09:14:09 crc kubenswrapper[4867]: I1201 09:14:09.522739 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6wmp" 
event={"ID":"62ccb15d-a0d6-4799-87e4-99cf2489fa16","Type":"ContainerStarted","Data":"439cbbf0e51a3c8ed853f333e382dede260258d7f601a8eea07b503e13591b06"} Dec 01 09:14:09 crc kubenswrapper[4867]: I1201 09:14:09.543384 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r6wmp" podStartSLOduration=2.036746316 podStartE2EDuration="4.543353459s" podCreationTimestamp="2025-12-01 09:14:05 +0000 UTC" firstStartedPulling="2025-12-01 09:14:06.497972033 +0000 UTC m=+367.957358787" lastFinishedPulling="2025-12-01 09:14:09.004579176 +0000 UTC m=+370.463965930" observedRunningTime="2025-12-01 09:14:09.542461434 +0000 UTC m=+371.001848198" watchObservedRunningTime="2025-12-01 09:14:09.543353459 +0000 UTC m=+371.002740213" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.588411 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-z2s5j"] Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.589353 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.600623 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-z2s5j"] Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.730116 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-registry-certificates\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.730168 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.730204 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.730235 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l68s\" (UniqueName: \"kubernetes.io/projected/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-kube-api-access-4l68s\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.730269 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-trusted-ca\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.730458 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-bound-sa-token\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.730487 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.730513 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-registry-tls\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.760513 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.831630 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-registry-certificates\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.831938 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.832114 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l68s\" (UniqueName: \"kubernetes.io/projected/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-kube-api-access-4l68s\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.832235 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-trusted-ca\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.832336 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-bound-sa-token\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.832429 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.832520 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-registry-tls\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.832740 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.832985 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-registry-certificates\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.834496 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-trusted-ca\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.839739 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-registry-tls\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.841310 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.849197 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-bound-sa-token\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: \"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.852496 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l68s\" (UniqueName: \"kubernetes.io/projected/14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c-kube-api-access-4l68s\") pod \"image-registry-66df7c8f76-z2s5j\" (UID: 
\"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c\") " pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.904420 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.937663 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:11 crc kubenswrapper[4867]: I1201 09:14:11.938529 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:12 crc kubenswrapper[4867]: I1201 09:14:12.004966 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:12 crc kubenswrapper[4867]: I1201 09:14:12.314900 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-z2s5j"] Dec 01 09:14:12 crc kubenswrapper[4867]: W1201 09:14:12.316527 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14cf982a_3cd6_44f4_87f2_3cd3e4e5df7c.slice/crio-2c793d2462bcdd719769294c8f6a930e9cf387ee5175f81f34424e7c4e9c1b6f WatchSource:0}: Error finding container 2c793d2462bcdd719769294c8f6a930e9cf387ee5175f81f34424e7c4e9c1b6f: Status 404 returned error can't find the container with id 2c793d2462bcdd719769294c8f6a930e9cf387ee5175f81f34424e7c4e9c1b6f Dec 01 09:14:12 crc kubenswrapper[4867]: I1201 09:14:12.539089 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" event={"ID":"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c","Type":"ContainerStarted","Data":"2c793d2462bcdd719769294c8f6a930e9cf387ee5175f81f34424e7c4e9c1b6f"} Dec 01 09:14:12 crc kubenswrapper[4867]: I1201 09:14:12.541454 4867 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:12 crc kubenswrapper[4867]: I1201 09:14:12.541494 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:12 crc kubenswrapper[4867]: I1201 09:14:12.579615 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:12 crc kubenswrapper[4867]: I1201 09:14:12.582059 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qqrrr" Dec 01 09:14:13 crc kubenswrapper[4867]: I1201 09:14:13.590377 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 09:14:14 crc kubenswrapper[4867]: I1201 09:14:14.252289 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58bf4dd977-w7pkh"] Dec 01 09:14:14 crc kubenswrapper[4867]: I1201 09:14:14.252828 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" podUID="01bfe5d2-9117-4932-b6d7-5eeeb790ce79" containerName="controller-manager" containerID="cri-o://521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3" gracePeriod=30 Dec 01 09:14:14 crc kubenswrapper[4867]: I1201 09:14:14.354472 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:14 crc kubenswrapper[4867]: I1201 09:14:14.354527 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:14 crc kubenswrapper[4867]: I1201 09:14:14.397845 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:14 crc kubenswrapper[4867]: I1201 09:14:14.548596 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" event={"ID":"14cf982a-3cd6-44f4-87f2-3cd3e4e5df7c","Type":"ContainerStarted","Data":"e906f570c26e10ede24de4770eb3fdef1977879f9fc4503fb4b06bcacd2c98c4"} Dec 01 09:14:14 crc kubenswrapper[4867]: I1201 09:14:14.593989 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:14:14 crc kubenswrapper[4867]: I1201 09:14:14.612485 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" podStartSLOduration=3.612468066 podStartE2EDuration="3.612468066s" podCreationTimestamp="2025-12-01 09:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:14:14.574951598 +0000 UTC m=+376.034338352" watchObservedRunningTime="2025-12-01 09:14:14.612468066 +0000 UTC m=+376.071854820" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.114482 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.275770 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-config\") pod \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.275857 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-client-ca\") pod \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.275880 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-proxy-ca-bundles\") pod \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.275979 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-serving-cert\") pod \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.276023 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbjd6\" (UniqueName: \"kubernetes.io/projected/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-kube-api-access-dbjd6\") pod \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\" (UID: \"01bfe5d2-9117-4932-b6d7-5eeeb790ce79\") " Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.276629 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-client-ca" (OuterVolumeSpecName: "client-ca") pod "01bfe5d2-9117-4932-b6d7-5eeeb790ce79" (UID: "01bfe5d2-9117-4932-b6d7-5eeeb790ce79"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.276658 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "01bfe5d2-9117-4932-b6d7-5eeeb790ce79" (UID: "01bfe5d2-9117-4932-b6d7-5eeeb790ce79"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.277194 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-config" (OuterVolumeSpecName: "config") pod "01bfe5d2-9117-4932-b6d7-5eeeb790ce79" (UID: "01bfe5d2-9117-4932-b6d7-5eeeb790ce79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.280703 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01bfe5d2-9117-4932-b6d7-5eeeb790ce79" (UID: "01bfe5d2-9117-4932-b6d7-5eeeb790ce79"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.280711 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-kube-api-access-dbjd6" (OuterVolumeSpecName: "kube-api-access-dbjd6") pod "01bfe5d2-9117-4932-b6d7-5eeeb790ce79" (UID: "01bfe5d2-9117-4932-b6d7-5eeeb790ce79"). InnerVolumeSpecName "kube-api-access-dbjd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.371840 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.371892 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.377451 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.377493 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.377504 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.377515 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.377525 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbjd6\" (UniqueName: \"kubernetes.io/projected/01bfe5d2-9117-4932-b6d7-5eeeb790ce79-kube-api-access-dbjd6\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.399785 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl"] Dec 01 09:14:15 crc kubenswrapper[4867]: E1201 
09:14:15.400263 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bfe5d2-9117-4932-b6d7-5eeeb790ce79" containerName="controller-manager" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.400290 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bfe5d2-9117-4932-b6d7-5eeeb790ce79" containerName="controller-manager" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.400534 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bfe5d2-9117-4932-b6d7-5eeeb790ce79" containerName="controller-manager" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.401287 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.405932 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl"] Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.419698 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.554548 4867 generic.go:334] "Generic (PLEG): container finished" podID="01bfe5d2-9117-4932-b6d7-5eeeb790ce79" containerID="521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3" exitCode=0 Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.554601 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.554688 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" event={"ID":"01bfe5d2-9117-4932-b6d7-5eeeb790ce79","Type":"ContainerDied","Data":"521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3"} Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.554757 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58bf4dd977-w7pkh" event={"ID":"01bfe5d2-9117-4932-b6d7-5eeeb790ce79","Type":"ContainerDied","Data":"0bea9190f23694740cead4e4bf2eb9868c68747d21be36beee24a23dcb5f8e63"} Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.554775 4867 scope.go:117] "RemoveContainer" containerID="521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.555404 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.576199 4867 scope.go:117] "RemoveContainer" containerID="521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.585172 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58bf4dd977-w7pkh"] Dec 01 09:14:15 crc kubenswrapper[4867]: E1201 09:14:15.585390 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3\": container with ID starting with 521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3 not found: ID does not exist" containerID="521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3" Dec 01 09:14:15 crc 
kubenswrapper[4867]: I1201 09:14:15.585428 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3"} err="failed to get container status \"521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3\": rpc error: code = NotFound desc = could not find container \"521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3\": container with ID starting with 521e1080c8cd80d9daa0a4690ee2aff081b5b548327dd8789c2ec6033086ecb3 not found: ID does not exist" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.586204 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-serving-cert\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.586251 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8m6f\" (UniqueName: \"kubernetes.io/projected/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-kube-api-access-h8m6f\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.586283 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-proxy-ca-bundles\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.586321 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-client-ca\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.586342 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-config\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.590679 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-58bf4dd977-w7pkh"] Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.602597 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r6wmp" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.687994 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-serving-cert\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.688057 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8m6f\" (UniqueName: \"kubernetes.io/projected/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-kube-api-access-h8m6f\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " 
pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.688090 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-proxy-ca-bundles\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.688138 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-config\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.688152 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-client-ca\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.689593 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-client-ca\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.690612 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-proxy-ca-bundles\") pod 
\"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.690658 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-config\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.698662 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-serving-cert\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.705594 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8m6f\" (UniqueName: \"kubernetes.io/projected/ae485bc7-fcdc-4bba-9817-6cc99a389dcb-kube-api-access-h8m6f\") pod \"controller-manager-58b7c6ff48-6sxpl\" (UID: \"ae485bc7-fcdc-4bba-9817-6cc99a389dcb\") " pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.717565 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:15 crc kubenswrapper[4867]: I1201 09:14:15.920181 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl"] Dec 01 09:14:16 crc kubenswrapper[4867]: I1201 09:14:16.564387 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" event={"ID":"ae485bc7-fcdc-4bba-9817-6cc99a389dcb","Type":"ContainerStarted","Data":"327edeec47e50044eacdd64e2bf6992eda85f7126ad019b30234771851cc9882"} Dec 01 09:14:16 crc kubenswrapper[4867]: I1201 09:14:16.835220 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bfe5d2-9117-4932-b6d7-5eeeb790ce79" path="/var/lib/kubelet/pods/01bfe5d2-9117-4932-b6d7-5eeeb790ce79/volumes" Dec 01 09:14:17 crc kubenswrapper[4867]: I1201 09:14:17.572848 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" event={"ID":"ae485bc7-fcdc-4bba-9817-6cc99a389dcb","Type":"ContainerStarted","Data":"690f8e6c0d2a073caceb1941fe521660a7337faf4d4820fbe1f1f1a7bae01a77"} Dec 01 09:14:18 crc kubenswrapper[4867]: I1201 09:14:18.578210 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:18 crc kubenswrapper[4867]: I1201 09:14:18.583854 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" Dec 01 09:14:18 crc kubenswrapper[4867]: I1201 09:14:18.602922 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58b7c6ff48-6sxpl" podStartSLOduration=4.602900066 podStartE2EDuration="4.602900066s" podCreationTimestamp="2025-12-01 09:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:14:18.595513969 +0000 UTC m=+380.054900733" watchObservedRunningTime="2025-12-01 09:14:18.602900066 +0000 UTC m=+380.062286820" Dec 01 09:14:21 crc kubenswrapper[4867]: I1201 09:14:21.600757 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:14:21 crc kubenswrapper[4867]: I1201 09:14:21.601340 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:14:31 crc kubenswrapper[4867]: I1201 09:14:31.909344 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-z2s5j" Dec 01 09:14:31 crc kubenswrapper[4867]: I1201 09:14:31.984926 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9k7d"] Dec 01 09:14:51 crc kubenswrapper[4867]: I1201 09:14:51.601544 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:14:51 crc kubenswrapper[4867]: I1201 09:14:51.602131 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:14:51 crc kubenswrapper[4867]: I1201 09:14:51.602226 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:14:51 crc kubenswrapper[4867]: I1201 09:14:51.602904 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b87fb643a4876e9b1ddead390c05ce1df38b99a37c0239fe24fe120587d956a8"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:14:51 crc kubenswrapper[4867]: I1201 09:14:51.602970 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://b87fb643a4876e9b1ddead390c05ce1df38b99a37c0239fe24fe120587d956a8" gracePeriod=600 Dec 01 09:14:51 crc kubenswrapper[4867]: I1201 09:14:51.768230 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="b87fb643a4876e9b1ddead390c05ce1df38b99a37c0239fe24fe120587d956a8" exitCode=0 Dec 01 09:14:51 crc kubenswrapper[4867]: I1201 09:14:51.768280 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"b87fb643a4876e9b1ddead390c05ce1df38b99a37c0239fe24fe120587d956a8"} Dec 01 09:14:51 crc kubenswrapper[4867]: I1201 09:14:51.768369 4867 scope.go:117] "RemoveContainer" containerID="a83e5aa81cc688e3c1a5b3ff2b18bc0cfe92f0ff6291b630c1ffde8384e657be" Dec 01 09:14:52 crc kubenswrapper[4867]: I1201 09:14:52.777416 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"0847c17cfa5036057c123c535bf976ed7bbb5b492abe611ef17929fa45cab386"} Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.023508 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" podUID="d00d9bfd-cd31-44f5-8b56-d14af3823d29" containerName="registry" containerID="cri-o://7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298" gracePeriod=30 Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.385538 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.464787 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95dlj\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-kube-api-access-95dlj\") pod \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.464899 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-certificates\") pod \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.464952 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-bound-sa-token\") pod \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 
09:14:57.464978 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-trusted-ca\") pod \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.465012 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d00d9bfd-cd31-44f5-8b56-d14af3823d29-installation-pull-secrets\") pod \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.465039 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-tls\") pod \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.465093 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d00d9bfd-cd31-44f5-8b56-d14af3823d29-ca-trust-extracted\") pod \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.465253 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\" (UID: \"d00d9bfd-cd31-44f5-8b56-d14af3823d29\") " Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.466298 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-certificates" 
(OuterVolumeSpecName: "registry-certificates") pod "d00d9bfd-cd31-44f5-8b56-d14af3823d29" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.473054 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d00d9bfd-cd31-44f5-8b56-d14af3823d29-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d00d9bfd-cd31-44f5-8b56-d14af3823d29" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.473797 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d00d9bfd-cd31-44f5-8b56-d14af3823d29" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.476199 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d00d9bfd-cd31-44f5-8b56-d14af3823d29" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.476384 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d00d9bfd-cd31-44f5-8b56-d14af3823d29" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.477946 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-kube-api-access-95dlj" (OuterVolumeSpecName: "kube-api-access-95dlj") pod "d00d9bfd-cd31-44f5-8b56-d14af3823d29" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29"). InnerVolumeSpecName "kube-api-access-95dlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.485912 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d00d9bfd-cd31-44f5-8b56-d14af3823d29-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d00d9bfd-cd31-44f5-8b56-d14af3823d29" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.488549 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d00d9bfd-cd31-44f5-8b56-d14af3823d29" (UID: "d00d9bfd-cd31-44f5-8b56-d14af3823d29"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.566189 4867 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d00d9bfd-cd31-44f5-8b56-d14af3823d29-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.566221 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95dlj\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-kube-api-access-95dlj\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.566232 4867 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.566241 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.566249 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d00d9bfd-cd31-44f5-8b56-d14af3823d29-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.566257 4867 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d00d9bfd-cd31-44f5-8b56-d14af3823d29-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.566265 4867 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d00d9bfd-cd31-44f5-8b56-d14af3823d29-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:14:57 crc 
kubenswrapper[4867]: I1201 09:14:57.812322 4867 generic.go:334] "Generic (PLEG): container finished" podID="d00d9bfd-cd31-44f5-8b56-d14af3823d29" containerID="7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298" exitCode=0 Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.812359 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.812382 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" event={"ID":"d00d9bfd-cd31-44f5-8b56-d14af3823d29","Type":"ContainerDied","Data":"7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298"} Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.812466 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n9k7d" event={"ID":"d00d9bfd-cd31-44f5-8b56-d14af3823d29","Type":"ContainerDied","Data":"3bed272814f2d34bd9f8a900983de2d953eabbd2a6e32418cda3103b6e2f789d"} Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.812503 4867 scope.go:117] "RemoveContainer" containerID="7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.831482 4867 scope.go:117] "RemoveContainer" containerID="7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298" Dec 01 09:14:57 crc kubenswrapper[4867]: E1201 09:14:57.832287 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298\": container with ID starting with 7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298 not found: ID does not exist" containerID="7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.832336 4867 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298"} err="failed to get container status \"7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298\": rpc error: code = NotFound desc = could not find container \"7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298\": container with ID starting with 7c1e50ad11ac1bee0583425fd2316089f6a208a1d119f1e1ea566a46f8b55298 not found: ID does not exist" Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.848048 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9k7d"] Dec 01 09:14:57 crc kubenswrapper[4867]: I1201 09:14:57.854638 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9k7d"] Dec 01 09:14:58 crc kubenswrapper[4867]: I1201 09:14:58.833612 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00d9bfd-cd31-44f5-8b56-d14af3823d29" path="/var/lib/kubelet/pods/d00d9bfd-cd31-44f5-8b56-d14af3823d29/volumes" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.165507 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz"] Dec 01 09:15:00 crc kubenswrapper[4867]: E1201 09:15:00.165894 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00d9bfd-cd31-44f5-8b56-d14af3823d29" containerName="registry" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.165913 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00d9bfd-cd31-44f5-8b56-d14af3823d29" containerName="registry" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.166063 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00d9bfd-cd31-44f5-8b56-d14af3823d29" containerName="registry" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.166798 4867 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.168780 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.173243 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.176000 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz"] Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.300444 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/658aa664-9092-421c-ab73-d7a75baff7f4-secret-volume\") pod \"collect-profiles-29409675-r5lsz\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.300870 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/658aa664-9092-421c-ab73-d7a75baff7f4-config-volume\") pod \"collect-profiles-29409675-r5lsz\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.300926 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8zvn\" (UniqueName: \"kubernetes.io/projected/658aa664-9092-421c-ab73-d7a75baff7f4-kube-api-access-f8zvn\") pod \"collect-profiles-29409675-r5lsz\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.402441 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8zvn\" (UniqueName: \"kubernetes.io/projected/658aa664-9092-421c-ab73-d7a75baff7f4-kube-api-access-f8zvn\") pod \"collect-profiles-29409675-r5lsz\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.402493 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/658aa664-9092-421c-ab73-d7a75baff7f4-secret-volume\") pod \"collect-profiles-29409675-r5lsz\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.402551 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/658aa664-9092-421c-ab73-d7a75baff7f4-config-volume\") pod \"collect-profiles-29409675-r5lsz\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.403326 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/658aa664-9092-421c-ab73-d7a75baff7f4-config-volume\") pod \"collect-profiles-29409675-r5lsz\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.407402 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/658aa664-9092-421c-ab73-d7a75baff7f4-secret-volume\") pod \"collect-profiles-29409675-r5lsz\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.423163 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8zvn\" (UniqueName: \"kubernetes.io/projected/658aa664-9092-421c-ab73-d7a75baff7f4-kube-api-access-f8zvn\") pod \"collect-profiles-29409675-r5lsz\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.482052 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.735545 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz"] Dec 01 09:15:00 crc kubenswrapper[4867]: I1201 09:15:00.834843 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" event={"ID":"658aa664-9092-421c-ab73-d7a75baff7f4","Type":"ContainerStarted","Data":"45d7498f6f4d9c684c80172631e6605b3b475b71f6218c386cd3f2648c53b66f"} Dec 01 09:15:01 crc kubenswrapper[4867]: I1201 09:15:01.836492 4867 generic.go:334] "Generic (PLEG): container finished" podID="658aa664-9092-421c-ab73-d7a75baff7f4" containerID="dc4f853ce1ec07b75ab1469fad3535e1a1f1f34b42b8e2485f306b5224b396f1" exitCode=0 Dec 01 09:15:01 crc kubenswrapper[4867]: I1201 09:15:01.836797 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" 
event={"ID":"658aa664-9092-421c-ab73-d7a75baff7f4","Type":"ContainerDied","Data":"dc4f853ce1ec07b75ab1469fad3535e1a1f1f34b42b8e2485f306b5224b396f1"} Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.059271 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.160280 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/658aa664-9092-421c-ab73-d7a75baff7f4-config-volume\") pod \"658aa664-9092-421c-ab73-d7a75baff7f4\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.160360 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8zvn\" (UniqueName: \"kubernetes.io/projected/658aa664-9092-421c-ab73-d7a75baff7f4-kube-api-access-f8zvn\") pod \"658aa664-9092-421c-ab73-d7a75baff7f4\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.160439 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/658aa664-9092-421c-ab73-d7a75baff7f4-secret-volume\") pod \"658aa664-9092-421c-ab73-d7a75baff7f4\" (UID: \"658aa664-9092-421c-ab73-d7a75baff7f4\") " Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.161168 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658aa664-9092-421c-ab73-d7a75baff7f4-config-volume" (OuterVolumeSpecName: "config-volume") pod "658aa664-9092-421c-ab73-d7a75baff7f4" (UID: "658aa664-9092-421c-ab73-d7a75baff7f4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.161678 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/658aa664-9092-421c-ab73-d7a75baff7f4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.165045 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658aa664-9092-421c-ab73-d7a75baff7f4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "658aa664-9092-421c-ab73-d7a75baff7f4" (UID: "658aa664-9092-421c-ab73-d7a75baff7f4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.165101 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658aa664-9092-421c-ab73-d7a75baff7f4-kube-api-access-f8zvn" (OuterVolumeSpecName: "kube-api-access-f8zvn") pod "658aa664-9092-421c-ab73-d7a75baff7f4" (UID: "658aa664-9092-421c-ab73-d7a75baff7f4"). InnerVolumeSpecName "kube-api-access-f8zvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.262596 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8zvn\" (UniqueName: \"kubernetes.io/projected/658aa664-9092-421c-ab73-d7a75baff7f4-kube-api-access-f8zvn\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.262634 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/658aa664-9092-421c-ab73-d7a75baff7f4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.851304 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" event={"ID":"658aa664-9092-421c-ab73-d7a75baff7f4","Type":"ContainerDied","Data":"45d7498f6f4d9c684c80172631e6605b3b475b71f6218c386cd3f2648c53b66f"} Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.851595 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45d7498f6f4d9c684c80172631e6605b3b475b71f6218c386cd3f2648c53b66f" Dec 01 09:15:03 crc kubenswrapper[4867]: I1201 09:15:03.851392 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz" Dec 01 09:16:51 crc kubenswrapper[4867]: I1201 09:16:51.601499 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:16:51 crc kubenswrapper[4867]: I1201 09:16:51.602148 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:16:59 crc kubenswrapper[4867]: I1201 09:16:59.005339 4867 scope.go:117] "RemoveContainer" containerID="1665cf0223812cc9388614badc6496840c24c829fb8af49496e17c7ff8f86ee5" Dec 01 09:17:21 crc kubenswrapper[4867]: I1201 09:17:21.601313 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:17:21 crc kubenswrapper[4867]: I1201 09:17:21.601987 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:17:51 crc kubenswrapper[4867]: I1201 09:17:51.601274 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:17:51 crc kubenswrapper[4867]: I1201 09:17:51.601776 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:17:51 crc kubenswrapper[4867]: I1201 09:17:51.601849 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:17:51 crc kubenswrapper[4867]: I1201 09:17:51.602318 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0847c17cfa5036057c123c535bf976ed7bbb5b492abe611ef17929fa45cab386"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:17:51 crc kubenswrapper[4867]: I1201 09:17:51.602370 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://0847c17cfa5036057c123c535bf976ed7bbb5b492abe611ef17929fa45cab386" gracePeriod=600 Dec 01 09:17:51 crc kubenswrapper[4867]: I1201 09:17:51.721068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"0847c17cfa5036057c123c535bf976ed7bbb5b492abe611ef17929fa45cab386"} Dec 01 09:17:51 crc kubenswrapper[4867]: I1201 09:17:51.721125 4867 scope.go:117] "RemoveContainer" 
containerID="b87fb643a4876e9b1ddead390c05ce1df38b99a37c0239fe24fe120587d956a8" Dec 01 09:17:51 crc kubenswrapper[4867]: I1201 09:17:51.721069 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="0847c17cfa5036057c123c535bf976ed7bbb5b492abe611ef17929fa45cab386" exitCode=0 Dec 01 09:17:52 crc kubenswrapper[4867]: I1201 09:17:52.727568 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"fa8ac94dfac3773a1b35360216b60b2041c8fba117bde3b6f4dcf7bb4fc033b2"} Dec 01 09:19:51 crc kubenswrapper[4867]: I1201 09:19:51.601128 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:19:51 crc kubenswrapper[4867]: I1201 09:19:51.601683 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.383562 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-njkbq"] Dec 01 09:20:14 crc kubenswrapper[4867]: E1201 09:20:14.384512 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658aa664-9092-421c-ab73-d7a75baff7f4" containerName="collect-profiles" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.384528 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="658aa664-9092-421c-ab73-d7a75baff7f4" containerName="collect-profiles" Dec 
01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.384648 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="658aa664-9092-421c-ab73-d7a75baff7f4" containerName="collect-profiles" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.385061 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-njkbq" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.385461 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-78xzp"] Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.386140 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-78xzp" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.392313 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6njzp" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.392495 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.392617 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.394532 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pf77b" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.410983 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-78xzp"] Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.421854 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-njkbq"] Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.429101 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nwld2"] Dec 01 09:20:14 crc 
kubenswrapper[4867]: I1201 09:20:14.429956 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.434450 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vjwrl" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.449398 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nwld2"] Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.531165 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p55cm\" (UniqueName: \"kubernetes.io/projected/6b09ecf0-40b3-4271-97da-a662a4b427d6-kube-api-access-p55cm\") pod \"cert-manager-cainjector-7f985d654d-njkbq\" (UID: \"6b09ecf0-40b3-4271-97da-a662a4b427d6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-njkbq" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.531349 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtl6q\" (UniqueName: \"kubernetes.io/projected/210c03bc-36ef-4bc0-ba17-db783a56d470-kube-api-access-vtl6q\") pod \"cert-manager-5b446d88c5-78xzp\" (UID: \"210c03bc-36ef-4bc0-ba17-db783a56d470\") " pod="cert-manager/cert-manager-5b446d88c5-78xzp" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.531441 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mb62\" (UniqueName: \"kubernetes.io/projected/de894f99-4158-4096-b100-4758130c6c12-kube-api-access-7mb62\") pod \"cert-manager-webhook-5655c58dd6-nwld2\" (UID: \"de894f99-4158-4096-b100-4758130c6c12\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.632193 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7mb62\" (UniqueName: \"kubernetes.io/projected/de894f99-4158-4096-b100-4758130c6c12-kube-api-access-7mb62\") pod \"cert-manager-webhook-5655c58dd6-nwld2\" (UID: \"de894f99-4158-4096-b100-4758130c6c12\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.632250 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p55cm\" (UniqueName: \"kubernetes.io/projected/6b09ecf0-40b3-4271-97da-a662a4b427d6-kube-api-access-p55cm\") pod \"cert-manager-cainjector-7f985d654d-njkbq\" (UID: \"6b09ecf0-40b3-4271-97da-a662a4b427d6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-njkbq" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.632309 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtl6q\" (UniqueName: \"kubernetes.io/projected/210c03bc-36ef-4bc0-ba17-db783a56d470-kube-api-access-vtl6q\") pod \"cert-manager-5b446d88c5-78xzp\" (UID: \"210c03bc-36ef-4bc0-ba17-db783a56d470\") " pod="cert-manager/cert-manager-5b446d88c5-78xzp" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.652700 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtl6q\" (UniqueName: \"kubernetes.io/projected/210c03bc-36ef-4bc0-ba17-db783a56d470-kube-api-access-vtl6q\") pod \"cert-manager-5b446d88c5-78xzp\" (UID: \"210c03bc-36ef-4bc0-ba17-db783a56d470\") " pod="cert-manager/cert-manager-5b446d88c5-78xzp" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.653608 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p55cm\" (UniqueName: \"kubernetes.io/projected/6b09ecf0-40b3-4271-97da-a662a4b427d6-kube-api-access-p55cm\") pod \"cert-manager-cainjector-7f985d654d-njkbq\" (UID: \"6b09ecf0-40b3-4271-97da-a662a4b427d6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-njkbq" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 
09:20:14.657837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mb62\" (UniqueName: \"kubernetes.io/projected/de894f99-4158-4096-b100-4758130c6c12-kube-api-access-7mb62\") pod \"cert-manager-webhook-5655c58dd6-nwld2\" (UID: \"de894f99-4158-4096-b100-4758130c6c12\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.707658 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-njkbq" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.716187 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-78xzp" Dec 01 09:20:14 crc kubenswrapper[4867]: I1201 09:20:14.752452 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" Dec 01 09:20:15 crc kubenswrapper[4867]: I1201 09:20:15.087100 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-njkbq"] Dec 01 09:20:15 crc kubenswrapper[4867]: I1201 09:20:15.094433 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:20:15 crc kubenswrapper[4867]: I1201 09:20:15.148076 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-78xzp"] Dec 01 09:20:15 crc kubenswrapper[4867]: W1201 09:20:15.156224 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod210c03bc_36ef_4bc0_ba17_db783a56d470.slice/crio-d6c4cd9088473da33d3892a7e626e0ab59594075adffc3e8b94e7a22e8856382 WatchSource:0}: Error finding container d6c4cd9088473da33d3892a7e626e0ab59594075adffc3e8b94e7a22e8856382: Status 404 returned error can't find the container with id 
d6c4cd9088473da33d3892a7e626e0ab59594075adffc3e8b94e7a22e8856382 Dec 01 09:20:15 crc kubenswrapper[4867]: I1201 09:20:15.400877 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nwld2"] Dec 01 09:20:15 crc kubenswrapper[4867]: I1201 09:20:15.501391 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-njkbq" event={"ID":"6b09ecf0-40b3-4271-97da-a662a4b427d6","Type":"ContainerStarted","Data":"b4f7c77212e23713a285f23d3ff9b209380fcf052206c95932e3d7c9b8bebe2d"} Dec 01 09:20:15 crc kubenswrapper[4867]: I1201 09:20:15.502702 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-78xzp" event={"ID":"210c03bc-36ef-4bc0-ba17-db783a56d470","Type":"ContainerStarted","Data":"d6c4cd9088473da33d3892a7e626e0ab59594075adffc3e8b94e7a22e8856382"} Dec 01 09:20:15 crc kubenswrapper[4867]: I1201 09:20:15.503662 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" event={"ID":"de894f99-4158-4096-b100-4758130c6c12","Type":"ContainerStarted","Data":"8f02dfd32532fc85adae1ae0305c33906941558813c4e552bfc47bdf935b14a6"} Dec 01 09:20:20 crc kubenswrapper[4867]: I1201 09:20:20.535017 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" event={"ID":"de894f99-4158-4096-b100-4758130c6c12","Type":"ContainerStarted","Data":"03e23daa332891f27dab480ba9322e22e7033802ae873e6f533b9b80f410e937"} Dec 01 09:20:20 crc kubenswrapper[4867]: I1201 09:20:20.536640 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" Dec 01 09:20:20 crc kubenswrapper[4867]: I1201 09:20:20.538042 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-njkbq" 
event={"ID":"6b09ecf0-40b3-4271-97da-a662a4b427d6","Type":"ContainerStarted","Data":"01236ddac63b6fa5bfd1f5a6b435bd2eff60f2a582fb02c57b886b6c8f4ef7ac"} Dec 01 09:20:20 crc kubenswrapper[4867]: I1201 09:20:20.540136 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-78xzp" event={"ID":"210c03bc-36ef-4bc0-ba17-db783a56d470","Type":"ContainerStarted","Data":"db2c82d86a976bec8936bb600a43ae3d69a8cdb833f377d243828749797fd101"} Dec 01 09:20:20 crc kubenswrapper[4867]: I1201 09:20:20.562089 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" podStartSLOduration=2.331639823 podStartE2EDuration="6.562064603s" podCreationTimestamp="2025-12-01 09:20:14 +0000 UTC" firstStartedPulling="2025-12-01 09:20:15.412291962 +0000 UTC m=+736.871678716" lastFinishedPulling="2025-12-01 09:20:19.642716742 +0000 UTC m=+741.102103496" observedRunningTime="2025-12-01 09:20:20.558096273 +0000 UTC m=+742.017483037" watchObservedRunningTime="2025-12-01 09:20:20.562064603 +0000 UTC m=+742.021451367" Dec 01 09:20:20 crc kubenswrapper[4867]: I1201 09:20:20.577685 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-78xzp" podStartSLOduration=2.152414861 podStartE2EDuration="6.577659259s" podCreationTimestamp="2025-12-01 09:20:14 +0000 UTC" firstStartedPulling="2025-12-01 09:20:15.159926068 +0000 UTC m=+736.619312822" lastFinishedPulling="2025-12-01 09:20:19.585170466 +0000 UTC m=+741.044557220" observedRunningTime="2025-12-01 09:20:20.574882231 +0000 UTC m=+742.034269005" watchObservedRunningTime="2025-12-01 09:20:20.577659259 +0000 UTC m=+742.037046013" Dec 01 09:20:21 crc kubenswrapper[4867]: I1201 09:20:21.601832 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:20:21 crc kubenswrapper[4867]: I1201 09:20:21.601928 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 09:20:23.906089 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-njkbq" podStartSLOduration=5.416696743 podStartE2EDuration="9.906058092s" podCreationTimestamp="2025-12-01 09:20:14 +0000 UTC" firstStartedPulling="2025-12-01 09:20:15.094229933 +0000 UTC m=+736.553616687" lastFinishedPulling="2025-12-01 09:20:19.583591262 +0000 UTC m=+741.042978036" observedRunningTime="2025-12-01 09:20:20.61137483 +0000 UTC m=+742.070761604" watchObservedRunningTime="2025-12-01 09:20:23.906058092 +0000 UTC m=+745.365444856" Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 09:20:23.907214 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kk2hn"] Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 09:20:23.907733 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovn-controller" containerID="cri-o://c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6" gracePeriod=30 Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 09:20:23.907829 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="nbdb" 
containerID="cri-o://22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe" gracePeriod=30 Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 09:20:23.907968 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovn-acl-logging" containerID="cri-o://4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e" gracePeriod=30 Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 09:20:23.907903 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="sbdb" containerID="cri-o://1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2" gracePeriod=30 Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 09:20:23.907874 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="northd" containerID="cri-o://e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe" gracePeriod=30 Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 09:20:23.907951 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kube-rbac-proxy-node" containerID="cri-o://185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3" gracePeriod=30 Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 09:20:23.907843 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd" gracePeriod=30 Dec 01 09:20:23 crc kubenswrapper[4867]: I1201 
09:20:23.955273 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" containerID="cri-o://cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d" gracePeriod=30 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.220869 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/3.log" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.223739 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovn-acl-logging/0.log" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.224289 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovn-controller/0.log" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.224726 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282500 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-csprr"] Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282687 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-env-overrides\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282718 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-systemd-units\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282765 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-log-socket\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282787 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-script-lib\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282829 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-netns\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: 
\"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282849 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-ovn\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282871 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-node-log\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282897 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-bin\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282891 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.282927 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovn-acl-logging" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282919 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-ovn-kubernetes\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282969 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282942 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.282947 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovn-acl-logging" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283002 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-node-log" (OuterVolumeSpecName: "node-log") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283006 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-openvswitch\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283004 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-log-socket" (OuterVolumeSpecName: "log-socket") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283021 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283023 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283041 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4925\" (UniqueName: \"kubernetes.io/projected/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-kube-api-access-g4925\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283058 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283059 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283044 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283089 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="northd" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283098 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="northd" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283111 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="nbdb" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283075 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-slash\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283119 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="nbdb" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283141 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-etc-openvswitch\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283152 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kubecfg-setup" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283160 4867 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kubecfg-setup" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283167 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovn-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283173 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovn-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283193 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283199 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283207 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="sbdb" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283207 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-netd\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283213 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="sbdb" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283250 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-config\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") 
" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283270 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kube-rbac-proxy-node" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283277 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-systemd\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283280 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kube-rbac-proxy-node" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283315 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283325 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283337 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283346 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283097 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-slash" (OuterVolumeSpecName: "host-slash") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283235 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283236 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283290 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283473 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283556 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283571 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="nbdb" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283582 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="northd" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283582 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283593 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="sbdb" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283602 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovn-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283615 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovn-acl-logging" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283626 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283635 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283645 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283653 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="kube-rbac-proxy-node" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283662 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283777 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283786 4867 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.283801 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283824 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.283926 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerName="ovnkube-controller" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.285582 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.285930 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-var-lib-openvswitch\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.285959 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.285990 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286009 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-kubelet\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286035 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovn-node-metrics-cert\") pod \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\" (UID: \"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32\") " Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286099 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286124 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286392 4867 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286413 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286423 4867 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286432 4867 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286439 4867 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286447 4867 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286456 4867 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286465 4867 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286473 4867 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286480 4867 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286488 4867 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286496 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286504 4867 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-var-lib-openvswitch\") on node \"crc\" 
DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286512 4867 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286520 4867 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286528 4867 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.286535 4867 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.289723 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.290199 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-kube-api-access-g4925" (OuterVolumeSpecName: "kube-api-access-g4925") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "kube-api-access-g4925". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.295728 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" (UID: "8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388062 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/68586ce4-61eb-47d1-9d14-a7ce868ccff4-ovnkube-config\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388458 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxd5k\" (UniqueName: \"kubernetes.io/projected/68586ce4-61eb-47d1-9d14-a7ce868ccff4-kube-api-access-dxd5k\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388491 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-var-lib-openvswitch\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388513 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-etc-openvswitch\") 
pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388526 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-run-ovn-kubernetes\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388541 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-cni-netd\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388611 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-run-openvswitch\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388678 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-kubelet\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388705 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-systemd-units\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388730 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-run-systemd\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388748 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/68586ce4-61eb-47d1-9d14-a7ce868ccff4-env-overrides\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388773 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/68586ce4-61eb-47d1-9d14-a7ce868ccff4-ovnkube-script-lib\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388797 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-run-netns\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388826 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-log-socket\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388844 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388869 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-run-ovn\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388908 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-cni-bin\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388932 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-slash\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388948 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-node-log\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.388968 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/68586ce4-61eb-47d1-9d14-a7ce868ccff4-ovn-node-metrics-cert\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.389027 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4925\" (UniqueName: \"kubernetes.io/projected/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-kube-api-access-g4925\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.389039 4867 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.389050 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489693 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-kubelet\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489745 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-systemd-units\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489773 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-run-systemd\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/68586ce4-61eb-47d1-9d14-a7ce868ccff4-env-overrides\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489835 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-kubelet\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489839 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/68586ce4-61eb-47d1-9d14-a7ce868ccff4-ovnkube-script-lib\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489872 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-run-netns\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489895 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-log-socket\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489915 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489937 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-run-ovn\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489969 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-cni-bin\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489980 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-run-systemd\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490021 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-systemd-units\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490064 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-log-socket\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.489993 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-slash\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490088 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-run-netns\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490142 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490108 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-run-ovn\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490173 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-node-log\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490183 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-node-log\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490215 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-cni-bin\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490222 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/68586ce4-61eb-47d1-9d14-a7ce868ccff4-ovn-node-metrics-cert\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 
09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490320 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/68586ce4-61eb-47d1-9d14-a7ce868ccff4-ovnkube-config\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490365 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxd5k\" (UniqueName: \"kubernetes.io/projected/68586ce4-61eb-47d1-9d14-a7ce868ccff4-kube-api-access-dxd5k\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490409 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-var-lib-openvswitch\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490461 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/68586ce4-61eb-47d1-9d14-a7ce868ccff4-env-overrides\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490506 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-etc-openvswitch\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490467 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-etc-openvswitch\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490544 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-run-ovn-kubernetes\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490567 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-cni-netd\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490596 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-run-openvswitch\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490640 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/68586ce4-61eb-47d1-9d14-a7ce868ccff4-ovnkube-script-lib\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490660 4867 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-run-openvswitch\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490692 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-run-ovn-kubernetes\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490722 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-cni-netd\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490754 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-var-lib-openvswitch\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.490223 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/68586ce4-61eb-47d1-9d14-a7ce868ccff4-host-slash\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.491540 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/68586ce4-61eb-47d1-9d14-a7ce868ccff4-ovnkube-config\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.495026 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/68586ce4-61eb-47d1-9d14-a7ce868ccff4-ovn-node-metrics-cert\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.511225 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxd5k\" (UniqueName: \"kubernetes.io/projected/68586ce4-61eb-47d1-9d14-a7ce868ccff4-kube-api-access-dxd5k\") pod \"ovnkube-node-csprr\" (UID: \"68586ce4-61eb-47d1-9d14-a7ce868ccff4\") " pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.566640 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tj9fl_c813b7ba-4c04-44d0-9f3e-3e5f4897fb73/kube-multus/2.log" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.567573 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tj9fl_c813b7ba-4c04-44d0-9f3e-3e5f4897fb73/kube-multus/1.log" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.567630 4867 generic.go:334] "Generic (PLEG): container finished" podID="c813b7ba-4c04-44d0-9f3e-3e5f4897fb73" containerID="dcafca89c954d08442575fb0289c167ad5b6c82180418cf7ed2a2e73f47682cb" exitCode=2 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.567719 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tj9fl" event={"ID":"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73","Type":"ContainerDied","Data":"dcafca89c954d08442575fb0289c167ad5b6c82180418cf7ed2a2e73f47682cb"} Dec 01 09:20:24 crc 
kubenswrapper[4867]: I1201 09:20:24.567759 4867 scope.go:117] "RemoveContainer" containerID="5e6b9eb6ccecedc334e31540ce3540114856b189aa31e762086629439a0dab9b" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.570069 4867 scope.go:117] "RemoveContainer" containerID="dcafca89c954d08442575fb0289c167ad5b6c82180418cf7ed2a2e73f47682cb" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.570696 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovnkube-controller/3.log" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.573752 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovn-acl-logging/0.log" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.574305 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kk2hn_8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/ovn-controller/0.log" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.575900 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d" exitCode=0 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.575922 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2" exitCode=0 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.575930 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe" exitCode=0 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.575937 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" 
containerID="e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe" exitCode=0 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.575945 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd" exitCode=0 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.575952 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3" exitCode=0 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.575960 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e" exitCode=143 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.575968 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" containerID="c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6" exitCode=143 Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.575988 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576015 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576028 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" 
event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576041 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576051 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576064 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576076 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576087 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576094 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576100 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576106 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576112 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576118 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576124 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576130 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576137 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576144 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576156 4867 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576164 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576170 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576177 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576183 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576189 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576195 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576200 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576206 4867 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576212 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576220 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576229 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576236 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576242 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576249 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576256 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} Dec 01 
09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576263 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576269 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576274 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576280 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576286 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576294 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" event={"ID":"8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32","Type":"ContainerDied","Data":"17bda87c2e237e71656257748c4d985a9fa6c1fe6585e62964de5ff46ae13308"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576305 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576313 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576319 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576325 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576331 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576337 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576342 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576348 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576355 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576361 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"} Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.576447 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kk2hn" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.602776 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.605430 4867 scope.go:117] "RemoveContainer" containerID="cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.629549 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kk2hn"] Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.633274 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kk2hn"] Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.639290 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.674536 4867 scope.go:117] "RemoveContainer" containerID="1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.694949 4867 scope.go:117] "RemoveContainer" containerID="22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.717432 4867 scope.go:117] "RemoveContainer" containerID="e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.733450 4867 scope.go:117] "RemoveContainer" containerID="669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.749346 4867 scope.go:117] "RemoveContainer" 
containerID="185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.754784 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwld2" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.762883 4867 scope.go:117] "RemoveContainer" containerID="4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.792380 4867 scope.go:117] "RemoveContainer" containerID="c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.833391 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32" path="/var/lib/kubelet/pods/8f21a2a8-1fd5-4a00-a8ac-02c1f24a3f32/volumes" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.834994 4867 scope.go:117] "RemoveContainer" containerID="44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.846176 4867 scope.go:117] "RemoveContainer" containerID="cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.846498 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d\": container with ID starting with cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d not found: ID does not exist" containerID="cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.846525 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} err="failed to get container status 
\"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d\": rpc error: code = NotFound desc = could not find container \"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d\": container with ID starting with cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.846542 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.846895 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\": container with ID starting with 83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3 not found: ID does not exist" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.846938 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} err="failed to get container status \"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\": rpc error: code = NotFound desc = could not find container \"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\": container with ID starting with 83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.846964 4867 scope.go:117] "RemoveContainer" containerID="1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.847312 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\": container with ID starting with 1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2 not found: ID does not exist" containerID="1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.847357 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} err="failed to get container status \"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\": rpc error: code = NotFound desc = could not find container \"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\": container with ID starting with 1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.847389 4867 scope.go:117] "RemoveContainer" containerID="22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.847721 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\": container with ID starting with 22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe not found: ID does not exist" containerID="22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.847743 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} err="failed to get container status \"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\": rpc error: code = NotFound desc = could not find container \"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\": container with ID 
starting with 22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.847757 4867 scope.go:117] "RemoveContainer" containerID="e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.848010 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\": container with ID starting with e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe not found: ID does not exist" containerID="e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.848037 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} err="failed to get container status \"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\": rpc error: code = NotFound desc = could not find container \"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\": container with ID starting with e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.848055 4867 scope.go:117] "RemoveContainer" containerID="669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.848257 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\": container with ID starting with 669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd not found: ID does not exist" containerID="669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd" Dec 01 
09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.848278 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} err="failed to get container status \"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\": rpc error: code = NotFound desc = could not find container \"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\": container with ID starting with 669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.848291 4867 scope.go:117] "RemoveContainer" containerID="185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.848668 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\": container with ID starting with 185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3 not found: ID does not exist" containerID="185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.848686 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} err="failed to get container status \"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\": rpc error: code = NotFound desc = could not find container \"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\": container with ID starting with 185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.848698 4867 scope.go:117] "RemoveContainer" 
containerID="4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.848988 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\": container with ID starting with 4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e not found: ID does not exist" containerID="4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.849006 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} err="failed to get container status \"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\": rpc error: code = NotFound desc = could not find container \"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\": container with ID starting with 4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.849018 4867 scope.go:117] "RemoveContainer" containerID="c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.849254 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\": container with ID starting with c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6 not found: ID does not exist" containerID="c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.849284 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} err="failed to get container status \"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\": rpc error: code = NotFound desc = could not find container \"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\": container with ID starting with c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.849304 4867 scope.go:117] "RemoveContainer" containerID="44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c" Dec 01 09:20:24 crc kubenswrapper[4867]: E1201 09:20:24.849551 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\": container with ID starting with 44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c not found: ID does not exist" containerID="44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.849580 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"} err="failed to get container status \"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\": rpc error: code = NotFound desc = could not find container \"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\": container with ID starting with 44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.849598 4867 scope.go:117] "RemoveContainer" containerID="cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.849908 4867 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} err="failed to get container status \"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d\": rpc error: code = NotFound desc = could not find container \"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d\": container with ID starting with cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.849932 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.850227 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} err="failed to get container status \"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\": rpc error: code = NotFound desc = could not find container \"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\": container with ID starting with 83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.850246 4867 scope.go:117] "RemoveContainer" containerID="1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.850538 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} err="failed to get container status \"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\": rpc error: code = NotFound desc = could not find container \"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\": container with ID starting with 1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2 not 
found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.850574 4867 scope.go:117] "RemoveContainer" containerID="22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.850870 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} err="failed to get container status \"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\": rpc error: code = NotFound desc = could not find container \"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\": container with ID starting with 22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.850891 4867 scope.go:117] "RemoveContainer" containerID="e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.851221 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} err="failed to get container status \"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\": rpc error: code = NotFound desc = could not find container \"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\": container with ID starting with e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.851239 4867 scope.go:117] "RemoveContainer" containerID="669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.851516 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} err="failed to get 
container status \"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\": rpc error: code = NotFound desc = could not find container \"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\": container with ID starting with 669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.851536 4867 scope.go:117] "RemoveContainer" containerID="185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.851791 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} err="failed to get container status \"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\": rpc error: code = NotFound desc = could not find container \"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\": container with ID starting with 185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.851829 4867 scope.go:117] "RemoveContainer" containerID="4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.852125 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} err="failed to get container status \"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\": rpc error: code = NotFound desc = could not find container \"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\": container with ID starting with 4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.852143 4867 scope.go:117] "RemoveContainer" 
containerID="c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.852379 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} err="failed to get container status \"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\": rpc error: code = NotFound desc = could not find container \"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\": container with ID starting with c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.852397 4867 scope.go:117] "RemoveContainer" containerID="44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.852596 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"} err="failed to get container status \"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\": rpc error: code = NotFound desc = could not find container \"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\": container with ID starting with 44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.852620 4867 scope.go:117] "RemoveContainer" containerID="cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.852932 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} err="failed to get container status \"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d\": rpc error: code = NotFound desc = could 
not find container \"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d\": container with ID starting with cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.852954 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.853247 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} err="failed to get container status \"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\": rpc error: code = NotFound desc = could not find container \"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\": container with ID starting with 83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.853267 4867 scope.go:117] "RemoveContainer" containerID="1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.853559 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} err="failed to get container status \"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\": rpc error: code = NotFound desc = could not find container \"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\": container with ID starting with 1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.853584 4867 scope.go:117] "RemoveContainer" containerID="22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 
09:20:24.853976 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} err="failed to get container status \"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\": rpc error: code = NotFound desc = could not find container \"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\": container with ID starting with 22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.853996 4867 scope.go:117] "RemoveContainer" containerID="e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.854330 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} err="failed to get container status \"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\": rpc error: code = NotFound desc = could not find container \"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\": container with ID starting with e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.854356 4867 scope.go:117] "RemoveContainer" containerID="669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.854689 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} err="failed to get container status \"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\": rpc error: code = NotFound desc = could not find container \"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\": container with ID starting with 
669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.854708 4867 scope.go:117] "RemoveContainer" containerID="185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.855045 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} err="failed to get container status \"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\": rpc error: code = NotFound desc = could not find container \"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\": container with ID starting with 185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.855066 4867 scope.go:117] "RemoveContainer" containerID="4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.855373 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} err="failed to get container status \"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\": rpc error: code = NotFound desc = could not find container \"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\": container with ID starting with 4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.855398 4867 scope.go:117] "RemoveContainer" containerID="c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.855683 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} err="failed to get container status \"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\": rpc error: code = NotFound desc = could not find container \"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\": container with ID starting with c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.855705 4867 scope.go:117] "RemoveContainer" containerID="44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.855922 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"} err="failed to get container status \"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\": rpc error: code = NotFound desc = could not find container \"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\": container with ID starting with 44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.855944 4867 scope.go:117] "RemoveContainer" containerID="cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.856312 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d"} err="failed to get container status \"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d\": rpc error: code = NotFound desc = could not find container \"cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d\": container with ID starting with cd0a71658ce816ab8947414895094f250936bf0db5cb3a1508975dbbe80cd62d not found: ID does not 
exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.856334 4867 scope.go:117] "RemoveContainer" containerID="83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.856642 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3"} err="failed to get container status \"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\": rpc error: code = NotFound desc = could not find container \"83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3\": container with ID starting with 83b75ca695c9f1f4001ae057b580f7ef61eb7cdce5f2be1abe39b469f05dabc3 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.856663 4867 scope.go:117] "RemoveContainer" containerID="1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.857066 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2"} err="failed to get container status \"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\": rpc error: code = NotFound desc = could not find container \"1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2\": container with ID starting with 1d41684bc2c38abb4645e8df08194e968e38ba72a21db3b5a02d69bbb71004d2 not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.857086 4867 scope.go:117] "RemoveContainer" containerID="22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.857428 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe"} err="failed to get container status 
\"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\": rpc error: code = NotFound desc = could not find container \"22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe\": container with ID starting with 22eb1cb36d3927930e3d09b367d47cb2804831fc9d9cf60f09ec49dff80ebafe not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.857449 4867 scope.go:117] "RemoveContainer" containerID="e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.857682 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe"} err="failed to get container status \"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\": rpc error: code = NotFound desc = could not find container \"e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe\": container with ID starting with e602a3f75833dae8542393da4b90812f6fb4a28c7e4047e32f5892b8660371fe not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.857702 4867 scope.go:117] "RemoveContainer" containerID="669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.858050 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd"} err="failed to get container status \"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\": rpc error: code = NotFound desc = could not find container \"669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd\": container with ID starting with 669f334eff2f84abc7424af26b753f6173e8f26dd911e382cfc1713e6caa62bd not found: ID does not exist" Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.858075 4867 scope.go:117] "RemoveContainer" 
containerID="185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"
Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.858380 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3"} err="failed to get container status \"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\": rpc error: code = NotFound desc = could not find container \"185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3\": container with ID starting with 185cc88c3b730ca7845f4591d42b818bf3ed24af092135962d3be0c959a7f2b3 not found: ID does not exist"
Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.858397 4867 scope.go:117] "RemoveContainer" containerID="4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"
Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.858696 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e"} err="failed to get container status \"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\": rpc error: code = NotFound desc = could not find container \"4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e\": container with ID starting with 4e1247f67243efd2fe72403f0fd8ad95d22dd207782781f63845e3e74367aa6e not found: ID does not exist"
Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.858723 4867 scope.go:117] "RemoveContainer" containerID="c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"
Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.859041 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6"} err="failed to get container status \"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\": rpc error: code = NotFound desc = could not find container \"c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6\": container with ID starting with c34508ee3a0e98985eb8c7cac6c7adb78bf5fb3ad8a885f0dd5baa77f3220be6 not found: ID does not exist"
Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.859058 4867 scope.go:117] "RemoveContainer" containerID="44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"
Dec 01 09:20:24 crc kubenswrapper[4867]: I1201 09:20:24.859374 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c"} err="failed to get container status \"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\": rpc error: code = NotFound desc = could not find container \"44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c\": container with ID starting with 44c19592f50c5c65cc0509811f54a4dc18ac297a7c6e798fc4c9eba453ab525c not found: ID does not exist"
Dec 01 09:20:25 crc kubenswrapper[4867]: I1201 09:20:25.590935 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tj9fl_c813b7ba-4c04-44d0-9f3e-3e5f4897fb73/kube-multus/2.log"
Dec 01 09:20:25 crc kubenswrapper[4867]: I1201 09:20:25.591361 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tj9fl" event={"ID":"c813b7ba-4c04-44d0-9f3e-3e5f4897fb73","Type":"ContainerStarted","Data":"836af7b8caf76e5ac6e658f873c07b1a556b8c2dd630ffbd8d4f857bd2db9861"}
Dec 01 09:20:25 crc kubenswrapper[4867]: I1201 09:20:25.596891 4867 generic.go:334] "Generic (PLEG): container finished" podID="68586ce4-61eb-47d1-9d14-a7ce868ccff4" containerID="69d7ea419e8edfe32bcf3668f15e52c534ae0a8e4ed3c2cafcc1de5d83b08ec7" exitCode=0
Dec 01 09:20:25 crc kubenswrapper[4867]: I1201 09:20:25.596956 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerDied","Data":"69d7ea419e8edfe32bcf3668f15e52c534ae0a8e4ed3c2cafcc1de5d83b08ec7"}
Dec 01 09:20:25 crc kubenswrapper[4867]: I1201 09:20:25.596994 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerStarted","Data":"f5355abc41e1e88fe1dde37cf4693ec2cf878c382f33ac9037fb06532a64a24c"}
Dec 01 09:20:26 crc kubenswrapper[4867]: I1201 09:20:26.605280 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerStarted","Data":"e4c7df09aad170b52461fc68a5b7201db99a4dca03cb5c3bbbb90eb64963a3a0"}
Dec 01 09:20:26 crc kubenswrapper[4867]: I1201 09:20:26.605647 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerStarted","Data":"bdf29c5a5c90280f6bcab9a6e007b2d8276e6044713c97b8069b61dcd9d90f10"}
Dec 01 09:20:26 crc kubenswrapper[4867]: I1201 09:20:26.605665 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerStarted","Data":"0160b7d537ee0159b9c50db82bf3c858dc6764dfd77e8ec35dc3b98ca94508da"}
Dec 01 09:20:26 crc kubenswrapper[4867]: I1201 09:20:26.605677 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerStarted","Data":"e2753a91126a147a08a6385bda4008b097312970d6313c6fe3116a86039631ab"}
Dec 01 09:20:26 crc kubenswrapper[4867]: I1201 09:20:26.605689 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerStarted","Data":"ff288d076cd1bfb3e912c137c25d6078d43d764ad9daac007817d7dae6591510"}
Dec 01 09:20:26 crc kubenswrapper[4867]: I1201 09:20:26.605702 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerStarted","Data":"1ac102080b92adb78ee10feac89c5d5a802a04c16c1f07af7d351cd8677c868a"}
Dec 01 09:20:28 crc kubenswrapper[4867]: I1201 09:20:28.617803 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerStarted","Data":"cb0fe5613871a1a1722835545a20d194607ab9a39534fa19ebd1f1f49f76e1aa"}
Dec 01 09:20:32 crc kubenswrapper[4867]: I1201 09:20:32.277871 4867 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 01 09:20:32 crc kubenswrapper[4867]: I1201 09:20:32.640658 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" event={"ID":"68586ce4-61eb-47d1-9d14-a7ce868ccff4","Type":"ContainerStarted","Data":"14827c8b47d58cb659bdee02c9fd26f0cc7ad46034e90c8a0e64b09151a5bb04"}
Dec 01 09:20:32 crc kubenswrapper[4867]: I1201 09:20:32.640977 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-csprr"
Dec 01 09:20:32 crc kubenswrapper[4867]: I1201 09:20:32.640991 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-csprr"
Dec 01 09:20:32 crc kubenswrapper[4867]: I1201 09:20:32.666243 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-csprr"
Dec 01 09:20:32 crc kubenswrapper[4867]: I1201 09:20:32.671070 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-csprr" podStartSLOduration=8.671056154 podStartE2EDuration="8.671056154s" podCreationTimestamp="2025-12-01 09:20:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:20:32.664671305 +0000 UTC m=+754.124058059" watchObservedRunningTime="2025-12-01 09:20:32.671056154 +0000 UTC m=+754.130442908"
Dec 01 09:20:33 crc kubenswrapper[4867]: I1201 09:20:33.645901 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-csprr"
Dec 01 09:20:33 crc kubenswrapper[4867]: I1201 09:20:33.670419 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-csprr"
Dec 01 09:20:51 crc kubenswrapper[4867]: I1201 09:20:51.601275 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 09:20:51 crc kubenswrapper[4867]: I1201 09:20:51.601933 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 09:20:51 crc kubenswrapper[4867]: I1201 09:20:51.602008 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2"
Dec 01 09:20:51 crc kubenswrapper[4867]: I1201 09:20:51.603009 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa8ac94dfac3773a1b35360216b60b2041c8fba117bde3b6f4dcf7bb4fc033b2"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 09:20:51 crc kubenswrapper[4867]: I1201 09:20:51.603152 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://fa8ac94dfac3773a1b35360216b60b2041c8fba117bde3b6f4dcf7bb4fc033b2" gracePeriod=600
Dec 01 09:20:51 crc kubenswrapper[4867]: I1201 09:20:51.753523 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="fa8ac94dfac3773a1b35360216b60b2041c8fba117bde3b6f4dcf7bb4fc033b2" exitCode=0
Dec 01 09:20:51 crc kubenswrapper[4867]: I1201 09:20:51.753567 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"fa8ac94dfac3773a1b35360216b60b2041c8fba117bde3b6f4dcf7bb4fc033b2"}
Dec 01 09:20:51 crc kubenswrapper[4867]: I1201 09:20:51.753606 4867 scope.go:117] "RemoveContainer" containerID="0847c17cfa5036057c123c535bf976ed7bbb5b492abe611ef17929fa45cab386"
Dec 01 09:20:52 crc kubenswrapper[4867]: I1201 09:20:52.764472 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"793e6afb196d113ee707de55107187443477607da81a9336611cc7c60ae9f91b"}
Dec 01 09:20:54 crc kubenswrapper[4867]: I1201 09:20:54.627275 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-csprr"
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.777364 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"]
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.778737 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.780909 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.797143 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"]
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.898142 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.898216 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4tjs\" (UniqueName: \"kubernetes.io/projected/eada6e99-1300-49a6-8732-f4b2024526dc-kube-api-access-v4tjs\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.898250 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.999737 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4tjs\" (UniqueName: \"kubernetes.io/projected/eada6e99-1300-49a6-8732-f4b2024526dc-kube-api-access-v4tjs\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.999831 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:04 crc kubenswrapper[4867]: I1201 09:21:04.999922 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:05 crc kubenswrapper[4867]: I1201 09:21:05.000550 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:05 crc kubenswrapper[4867]: I1201 09:21:05.001000 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:05 crc kubenswrapper[4867]: I1201 09:21:05.021736 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4tjs\" (UniqueName: \"kubernetes.io/projected/eada6e99-1300-49a6-8732-f4b2024526dc-kube-api-access-v4tjs\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:05 crc kubenswrapper[4867]: I1201 09:21:05.092437 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:05 crc kubenswrapper[4867]: I1201 09:21:05.290746 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"]
Dec 01 09:21:05 crc kubenswrapper[4867]: W1201 09:21:05.295509 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeada6e99_1300_49a6_8732_f4b2024526dc.slice/crio-02ba901612d893203db6047cd8827e441f9a41d80c24b261bcc5c64a7f188731 WatchSource:0}: Error finding container 02ba901612d893203db6047cd8827e441f9a41d80c24b261bcc5c64a7f188731: Status 404 returned error can't find the container with id 02ba901612d893203db6047cd8827e441f9a41d80c24b261bcc5c64a7f188731
Dec 01 09:21:05 crc kubenswrapper[4867]: I1201 09:21:05.837381 4867 generic.go:334] "Generic (PLEG): container finished" podID="eada6e99-1300-49a6-8732-f4b2024526dc" containerID="d2daac29fc863d570c06b10757d4f935e4f4b594fc942622235a893188290a9d" exitCode=0
Dec 01 09:21:05 crc kubenswrapper[4867]: I1201 09:21:05.837487 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb" event={"ID":"eada6e99-1300-49a6-8732-f4b2024526dc","Type":"ContainerDied","Data":"d2daac29fc863d570c06b10757d4f935e4f4b594fc942622235a893188290a9d"}
Dec 01 09:21:05 crc kubenswrapper[4867]: I1201 09:21:05.837667 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb" event={"ID":"eada6e99-1300-49a6-8732-f4b2024526dc","Type":"ContainerStarted","Data":"02ba901612d893203db6047cd8827e441f9a41d80c24b261bcc5c64a7f188731"}
Dec 01 09:21:06 crc kubenswrapper[4867]: I1201 09:21:06.968566 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d6g9c"]
Dec 01 09:21:06 crc kubenswrapper[4867]: I1201 09:21:06.969550 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:06 crc kubenswrapper[4867]: I1201 09:21:06.989384 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6g9c"]
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.029651 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-utilities\") pod \"redhat-operators-d6g9c\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.029718 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9bn\" (UniqueName: \"kubernetes.io/projected/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-kube-api-access-sr9bn\") pod \"redhat-operators-d6g9c\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.029800 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-catalog-content\") pod \"redhat-operators-d6g9c\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.131456 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-catalog-content\") pod \"redhat-operators-d6g9c\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.131544 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-utilities\") pod \"redhat-operators-d6g9c\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.131581 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9bn\" (UniqueName: \"kubernetes.io/projected/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-kube-api-access-sr9bn\") pod \"redhat-operators-d6g9c\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.132031 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-catalog-content\") pod \"redhat-operators-d6g9c\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.132122 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-utilities\") pod \"redhat-operators-d6g9c\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.153237 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9bn\" (UniqueName: \"kubernetes.io/projected/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-kube-api-access-sr9bn\") pod \"redhat-operators-d6g9c\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.285629 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.519923 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6g9c"]
Dec 01 09:21:07 crc kubenswrapper[4867]: W1201 09:21:07.526563 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f764cf1_fc27_439c_8c61_9bc7e241d1fa.slice/crio-8ac790cccc7ab1047467dc698e03f04bc9a76e2625eed39f11ff2e59014b984c WatchSource:0}: Error finding container 8ac790cccc7ab1047467dc698e03f04bc9a76e2625eed39f11ff2e59014b984c: Status 404 returned error can't find the container with id 8ac790cccc7ab1047467dc698e03f04bc9a76e2625eed39f11ff2e59014b984c
Dec 01 09:21:07 crc kubenswrapper[4867]: I1201 09:21:07.846112 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6g9c" event={"ID":"1f764cf1-fc27-439c-8c61-9bc7e241d1fa","Type":"ContainerStarted","Data":"8ac790cccc7ab1047467dc698e03f04bc9a76e2625eed39f11ff2e59014b984c"}
Dec 01 09:21:09 crc kubenswrapper[4867]: I1201 09:21:09.520620 4867 generic.go:334] "Generic (PLEG): container finished" podID="eada6e99-1300-49a6-8732-f4b2024526dc" containerID="6a7f5929cbc603e3527df1f9712b00def6fca3badbd14afebd447a180afd5a27" exitCode=0
Dec 01 09:21:09 crc kubenswrapper[4867]: I1201 09:21:09.520721 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb" event={"ID":"eada6e99-1300-49a6-8732-f4b2024526dc","Type":"ContainerDied","Data":"6a7f5929cbc603e3527df1f9712b00def6fca3badbd14afebd447a180afd5a27"}
Dec 01 09:21:09 crc kubenswrapper[4867]: I1201 09:21:09.522360 4867 generic.go:334] "Generic (PLEG): container finished" podID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerID="a2bdfd3e39467d563c214005f41e4c5880b6f8912e2d53a492400b8d593da1f1" exitCode=0
Dec 01 09:21:09 crc kubenswrapper[4867]: I1201 09:21:09.523382 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6g9c" event={"ID":"1f764cf1-fc27-439c-8c61-9bc7e241d1fa","Type":"ContainerDied","Data":"a2bdfd3e39467d563c214005f41e4c5880b6f8912e2d53a492400b8d593da1f1"}
Dec 01 09:21:10 crc kubenswrapper[4867]: I1201 09:21:10.530376 4867 generic.go:334] "Generic (PLEG): container finished" podID="eada6e99-1300-49a6-8732-f4b2024526dc" containerID="796b842c156d2870f11b7477c44071ca1113e1295a05caaa28b2fc0d76d5606b" exitCode=0
Dec 01 09:21:10 crc kubenswrapper[4867]: I1201 09:21:10.530446 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb" event={"ID":"eada6e99-1300-49a6-8732-f4b2024526dc","Type":"ContainerDied","Data":"796b842c156d2870f11b7477c44071ca1113e1295a05caaa28b2fc0d76d5606b"}
Dec 01 09:21:11 crc kubenswrapper[4867]: I1201 09:21:11.762803 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:11 crc kubenswrapper[4867]: I1201 09:21:11.935928 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-bundle\") pod \"eada6e99-1300-49a6-8732-f4b2024526dc\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") "
Dec 01 09:21:11 crc kubenswrapper[4867]: I1201 09:21:11.936097 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-util\") pod \"eada6e99-1300-49a6-8732-f4b2024526dc\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") "
Dec 01 09:21:11 crc kubenswrapper[4867]: I1201 09:21:11.936165 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4tjs\" (UniqueName: \"kubernetes.io/projected/eada6e99-1300-49a6-8732-f4b2024526dc-kube-api-access-v4tjs\") pod \"eada6e99-1300-49a6-8732-f4b2024526dc\" (UID: \"eada6e99-1300-49a6-8732-f4b2024526dc\") "
Dec 01 09:21:11 crc kubenswrapper[4867]: I1201 09:21:11.940578 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-bundle" (OuterVolumeSpecName: "bundle") pod "eada6e99-1300-49a6-8732-f4b2024526dc" (UID: "eada6e99-1300-49a6-8732-f4b2024526dc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:21:11 crc kubenswrapper[4867]: I1201 09:21:11.948597 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-util" (OuterVolumeSpecName: "util") pod "eada6e99-1300-49a6-8732-f4b2024526dc" (UID: "eada6e99-1300-49a6-8732-f4b2024526dc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:21:11 crc kubenswrapper[4867]: I1201 09:21:11.952529 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eada6e99-1300-49a6-8732-f4b2024526dc-kube-api-access-v4tjs" (OuterVolumeSpecName: "kube-api-access-v4tjs") pod "eada6e99-1300-49a6-8732-f4b2024526dc" (UID: "eada6e99-1300-49a6-8732-f4b2024526dc"). InnerVolumeSpecName "kube-api-access-v4tjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:21:12 crc kubenswrapper[4867]: I1201 09:21:12.040798 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 09:21:12 crc kubenswrapper[4867]: I1201 09:21:12.040888 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eada6e99-1300-49a6-8732-f4b2024526dc-util\") on node \"crc\" DevicePath \"\""
Dec 01 09:21:12 crc kubenswrapper[4867]: I1201 09:21:12.040902 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4tjs\" (UniqueName: \"kubernetes.io/projected/eada6e99-1300-49a6-8732-f4b2024526dc-kube-api-access-v4tjs\") on node \"crc\" DevicePath \"\""
Dec 01 09:21:12 crc kubenswrapper[4867]: I1201 09:21:12.549222 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6g9c" event={"ID":"1f764cf1-fc27-439c-8c61-9bc7e241d1fa","Type":"ContainerStarted","Data":"535c8888a364d10958a6d5eac8a33ae75e0047fa96246fde60b874743a40bc8e"}
Dec 01 09:21:12 crc kubenswrapper[4867]: I1201 09:21:12.552632 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb" event={"ID":"eada6e99-1300-49a6-8732-f4b2024526dc","Type":"ContainerDied","Data":"02ba901612d893203db6047cd8827e441f9a41d80c24b261bcc5c64a7f188731"}
Dec 01 09:21:12 crc kubenswrapper[4867]: I1201 09:21:12.552686 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ba901612d893203db6047cd8827e441f9a41d80c24b261bcc5c64a7f188731"
Dec 01 09:21:12 crc kubenswrapper[4867]: I1201 09:21:12.552766 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb"
Dec 01 09:21:13 crc kubenswrapper[4867]: I1201 09:21:13.559672 4867 generic.go:334] "Generic (PLEG): container finished" podID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerID="535c8888a364d10958a6d5eac8a33ae75e0047fa96246fde60b874743a40bc8e" exitCode=0
Dec 01 09:21:13 crc kubenswrapper[4867]: I1201 09:21:13.559724 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6g9c" event={"ID":"1f764cf1-fc27-439c-8c61-9bc7e241d1fa","Type":"ContainerDied","Data":"535c8888a364d10958a6d5eac8a33ae75e0047fa96246fde60b874743a40bc8e"}
Dec 01 09:21:14 crc kubenswrapper[4867]: I1201 09:21:14.567160 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6g9c" event={"ID":"1f764cf1-fc27-439c-8c61-9bc7e241d1fa","Type":"ContainerStarted","Data":"3d668558f0462c91d4caeafce4f865adfb92e90a22e37f46b7e97a673189294c"}
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.128891 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d6g9c" podStartSLOduration=5.367089063 podStartE2EDuration="10.128865376s" podCreationTimestamp="2025-12-01 09:21:06 +0000 UTC" firstStartedPulling="2025-12-01 09:21:09.524748248 +0000 UTC m=+790.984135002" lastFinishedPulling="2025-12-01 09:21:14.286524561 +0000 UTC m=+795.745911315" observedRunningTime="2025-12-01 09:21:14.587678976 +0000 UTC m=+796.047065740" watchObservedRunningTime="2025-12-01 09:21:16.128865376 +0000 UTC m=+797.588252130"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.135273 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz"]
Dec 01 09:21:16 crc kubenswrapper[4867]: E1201 09:21:16.135812 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eada6e99-1300-49a6-8732-f4b2024526dc" containerName="extract"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.135851 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="eada6e99-1300-49a6-8732-f4b2024526dc" containerName="extract"
Dec 01 09:21:16 crc kubenswrapper[4867]: E1201 09:21:16.135877 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eada6e99-1300-49a6-8732-f4b2024526dc" containerName="pull"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.135893 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="eada6e99-1300-49a6-8732-f4b2024526dc" containerName="pull"
Dec 01 09:21:16 crc kubenswrapper[4867]: E1201 09:21:16.135913 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eada6e99-1300-49a6-8732-f4b2024526dc" containerName="util"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.135922 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="eada6e99-1300-49a6-8732-f4b2024526dc" containerName="util"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.136197 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="eada6e99-1300-49a6-8732-f4b2024526dc" containerName="extract"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.136926 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.145022 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.145277 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.145685 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tcf48"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.163271 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz"]
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.298124 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r246w\" (UniqueName: \"kubernetes.io/projected/92290d91-f34b-4ef8-a2a7-15ed05a8c2a5-kube-api-access-r246w\") pod \"nmstate-operator-5b5b58f5c8-wthkz\" (UID: \"92290d91-f34b-4ef8-a2a7-15ed05a8c2a5\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.400639 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r246w\" (UniqueName: \"kubernetes.io/projected/92290d91-f34b-4ef8-a2a7-15ed05a8c2a5-kube-api-access-r246w\") pod \"nmstate-operator-5b5b58f5c8-wthkz\" (UID: \"92290d91-f34b-4ef8-a2a7-15ed05a8c2a5\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.421543 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r246w\" (UniqueName: \"kubernetes.io/projected/92290d91-f34b-4ef8-a2a7-15ed05a8c2a5-kube-api-access-r246w\") pod \"nmstate-operator-5b5b58f5c8-wthkz\" (UID: \"92290d91-f34b-4ef8-a2a7-15ed05a8c2a5\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.464928 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz"
Dec 01 09:21:16 crc kubenswrapper[4867]: I1201 09:21:16.891162 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz"]
Dec 01 09:21:16 crc kubenswrapper[4867]: W1201 09:21:16.899238 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92290d91_f34b_4ef8_a2a7_15ed05a8c2a5.slice/crio-a760d17690ad69f727f5727b83618b5430504baba5b0205dee796dde60116ec6 WatchSource:0}: Error finding container a760d17690ad69f727f5727b83618b5430504baba5b0205dee796dde60116ec6: Status 404 returned error can't find the container with id a760d17690ad69f727f5727b83618b5430504baba5b0205dee796dde60116ec6
Dec 01 09:21:17 crc kubenswrapper[4867]: I1201 09:21:17.286006 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:17 crc kubenswrapper[4867]: I1201 09:21:17.286090 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d6g9c"
Dec 01 09:21:17 crc kubenswrapper[4867]: I1201 09:21:17.589498 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz" event={"ID":"92290d91-f34b-4ef8-a2a7-15ed05a8c2a5","Type":"ContainerStarted","Data":"a760d17690ad69f727f5727b83618b5430504baba5b0205dee796dde60116ec6"}
Dec 01 09:21:18 crc kubenswrapper[4867]: I1201 09:21:18.326543 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d6g9c" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerName="registry-server" probeResult="failure" output=<
Dec 01 09:21:18 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s
Dec 01 09:21:18 crc kubenswrapper[4867]: >
Dec 01 09:21:20 crc kubenswrapper[4867]: I1201 09:21:20.607034 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz" event={"ID":"92290d91-f34b-4ef8-a2a7-15ed05a8c2a5","Type":"ContainerStarted","Data":"ad1bbadbb11b86c4c4e883d59edc1bd3a40e6d451663626200787dd9568693c8"}
Dec 01 09:21:20 crc kubenswrapper[4867]: I1201 09:21:20.620330 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wthkz" podStartSLOduration=1.175701927 podStartE2EDuration="4.620316634s" podCreationTimestamp="2025-12-01 09:21:16 +0000 UTC" firstStartedPulling="2025-12-01 09:21:16.902036218 +0000 UTC m=+798.361422972" lastFinishedPulling="2025-12-01 09:21:20.346650925 +0000 UTC m=+801.806037679" observedRunningTime="2025-12-01 09:21:20.620057706 +0000 UTC m=+802.079444460" watchObservedRunningTime="2025-12-01 09:21:20.620316634 +0000 UTC m=+802.079703388"
Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.061736 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt"]
Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.063600 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.080959 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt"] Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.081273 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-brq4r" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.091534 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8"] Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.092278 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.100748 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.133088 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-cbknx"] Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.134720 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.149588 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8"] Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.213158 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b4sd\" (UniqueName: \"kubernetes.io/projected/4749ce2f-6e1e-47ef-a5f1-bdd63f409214-kube-api-access-4b4sd\") pod \"nmstate-metrics-7f946cbc9-dnfqt\" (UID: \"4749ce2f-6e1e-47ef-a5f1-bdd63f409214\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.213576 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7c88\" (UniqueName: \"kubernetes.io/projected/e2549111-bcf2-4c87-abdd-0d4cd9353be9-kube-api-access-n7c88\") pod \"nmstate-webhook-5f6d4c5ccb-s47c8\" (UID: \"e2549111-bcf2-4c87-abdd-0d4cd9353be9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.213890 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e2549111-bcf2-4c87-abdd-0d4cd9353be9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s47c8\" (UID: \"e2549111-bcf2-4c87-abdd-0d4cd9353be9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.233069 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f"] Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.233757 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.243070 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.243167 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-tflw9" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.243350 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.252479 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f"] Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.314912 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjnj\" (UniqueName: \"kubernetes.io/projected/7a3fd2df-a271-4ff0-8488-7f442aedf04e-kube-api-access-qkjnj\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.314971 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7c88\" (UniqueName: \"kubernetes.io/projected/e2549111-bcf2-4c87-abdd-0d4cd9353be9-kube-api-access-n7c88\") pod \"nmstate-webhook-5f6d4c5ccb-s47c8\" (UID: \"e2549111-bcf2-4c87-abdd-0d4cd9353be9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.315018 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e2549111-bcf2-4c87-abdd-0d4cd9353be9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s47c8\" (UID: \"e2549111-bcf2-4c87-abdd-0d4cd9353be9\") " 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.321061 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7a3fd2df-a271-4ff0-8488-7f442aedf04e-dbus-socket\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.321150 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7a3fd2df-a271-4ff0-8488-7f442aedf04e-ovs-socket\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.321177 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7a3fd2df-a271-4ff0-8488-7f442aedf04e-nmstate-lock\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.321219 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b4sd\" (UniqueName: \"kubernetes.io/projected/4749ce2f-6e1e-47ef-a5f1-bdd63f409214-kube-api-access-4b4sd\") pod \"nmstate-metrics-7f946cbc9-dnfqt\" (UID: \"4749ce2f-6e1e-47ef-a5f1-bdd63f409214\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.331874 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e2549111-bcf2-4c87-abdd-0d4cd9353be9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s47c8\" (UID: \"e2549111-bcf2-4c87-abdd-0d4cd9353be9\") " 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.347721 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b4sd\" (UniqueName: \"kubernetes.io/projected/4749ce2f-6e1e-47ef-a5f1-bdd63f409214-kube-api-access-4b4sd\") pod \"nmstate-metrics-7f946cbc9-dnfqt\" (UID: \"4749ce2f-6e1e-47ef-a5f1-bdd63f409214\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.357998 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7c88\" (UniqueName: \"kubernetes.io/projected/e2549111-bcf2-4c87-abdd-0d4cd9353be9-kube-api-access-n7c88\") pod \"nmstate-webhook-5f6d4c5ccb-s47c8\" (UID: \"e2549111-bcf2-4c87-abdd-0d4cd9353be9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.380381 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.414487 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.421962 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7a3fd2df-a271-4ff0-8488-7f442aedf04e-dbus-socket\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.422027 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7a3fd2df-a271-4ff0-8488-7f442aedf04e-ovs-socket\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.422054 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7a3fd2df-a271-4ff0-8488-7f442aedf04e-nmstate-lock\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.422091 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjnj\" (UniqueName: \"kubernetes.io/projected/7a3fd2df-a271-4ff0-8488-7f442aedf04e-kube-api-access-qkjnj\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.422120 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6e4c850-11f6-495b-a90f-5936dda915e7-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-fzb6f\" (UID: \"f6e4c850-11f6-495b-a90f-5936dda915e7\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.422167 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n88f\" (UniqueName: \"kubernetes.io/projected/f6e4c850-11f6-495b-a90f-5936dda915e7-kube-api-access-7n88f\") pod \"nmstate-console-plugin-7fbb5f6569-fzb6f\" (UID: \"f6e4c850-11f6-495b-a90f-5936dda915e7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.422193 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f6e4c850-11f6-495b-a90f-5936dda915e7-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-fzb6f\" (UID: \"f6e4c850-11f6-495b-a90f-5936dda915e7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.422280 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7a3fd2df-a271-4ff0-8488-7f442aedf04e-ovs-socket\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.422281 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7a3fd2df-a271-4ff0-8488-7f442aedf04e-dbus-socket\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.422319 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7a3fd2df-a271-4ff0-8488-7f442aedf04e-nmstate-lock\") pod \"nmstate-handler-cbknx\" (UID: 
\"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.459451 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fb7678fb5-m5sb7"] Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.468464 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.462249 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjnj\" (UniqueName: \"kubernetes.io/projected/7a3fd2df-a271-4ff0-8488-7f442aedf04e-kube-api-access-qkjnj\") pod \"nmstate-handler-cbknx\" (UID: \"7a3fd2df-a271-4ff0-8488-7f442aedf04e\") " pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.487812 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb7678fb5-m5sb7"] Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.523005 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n88f\" (UniqueName: \"kubernetes.io/projected/f6e4c850-11f6-495b-a90f-5936dda915e7-kube-api-access-7n88f\") pod \"nmstate-console-plugin-7fbb5f6569-fzb6f\" (UID: \"f6e4c850-11f6-495b-a90f-5936dda915e7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.523067 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f6e4c850-11f6-495b-a90f-5936dda915e7-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-fzb6f\" (UID: \"f6e4c850-11f6-495b-a90f-5936dda915e7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.523153 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6e4c850-11f6-495b-a90f-5936dda915e7-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-fzb6f\" (UID: \"f6e4c850-11f6-495b-a90f-5936dda915e7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.524931 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f6e4c850-11f6-495b-a90f-5936dda915e7-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-fzb6f\" (UID: \"f6e4c850-11f6-495b-a90f-5936dda915e7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.528734 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6e4c850-11f6-495b-a90f-5936dda915e7-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-fzb6f\" (UID: \"f6e4c850-11f6-495b-a90f-5936dda915e7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.544916 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n88f\" (UniqueName: \"kubernetes.io/projected/f6e4c850-11f6-495b-a90f-5936dda915e7-kube-api-access-7n88f\") pod \"nmstate-console-plugin-7fbb5f6569-fzb6f\" (UID: \"f6e4c850-11f6-495b-a90f-5936dda915e7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.551442 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.623904 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4q4h\" (UniqueName: \"kubernetes.io/projected/440e37f6-4e09-4caa-bd2e-f35ab299e270-kube-api-access-f4q4h\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.623963 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-console-config\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.624014 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-oauth-serving-cert\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.624038 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-trusted-ca-bundle\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.624063 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/440e37f6-4e09-4caa-bd2e-f35ab299e270-console-oauth-config\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.624087 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-service-ca\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.624110 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/440e37f6-4e09-4caa-bd2e-f35ab299e270-console-serving-cert\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.725558 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/440e37f6-4e09-4caa-bd2e-f35ab299e270-console-oauth-config\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.726602 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-service-ca\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.726655 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/440e37f6-4e09-4caa-bd2e-f35ab299e270-console-serving-cert\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.726689 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4q4h\" (UniqueName: \"kubernetes.io/projected/440e37f6-4e09-4caa-bd2e-f35ab299e270-kube-api-access-f4q4h\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.726707 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-console-config\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.726769 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-oauth-serving-cert\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.726829 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-trusted-ca-bundle\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.728214 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-trusted-ca-bundle\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.728464 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-oauth-serving-cert\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.728531 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-console-config\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.729184 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/440e37f6-4e09-4caa-bd2e-f35ab299e270-service-ca\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.730654 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/440e37f6-4e09-4caa-bd2e-f35ab299e270-console-oauth-config\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.732479 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/440e37f6-4e09-4caa-bd2e-f35ab299e270-console-serving-cert\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.749450 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4q4h\" (UniqueName: \"kubernetes.io/projected/440e37f6-4e09-4caa-bd2e-f35ab299e270-kube-api-access-f4q4h\") pod \"console-6fb7678fb5-m5sb7\" (UID: \"440e37f6-4e09-4caa-bd2e-f35ab299e270\") " pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.754379 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.788025 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt"] Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.790883 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:25 crc kubenswrapper[4867]: W1201 09:21:25.791278 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4749ce2f_6e1e_47ef_a5f1_bdd63f409214.slice/crio-c325d7c39805c32831dfe3ff73f571eb6da097d251f6e7e0ebfdc9eccd96b6ed WatchSource:0}: Error finding container c325d7c39805c32831dfe3ff73f571eb6da097d251f6e7e0ebfdc9eccd96b6ed: Status 404 returned error can't find the container with id c325d7c39805c32831dfe3ff73f571eb6da097d251f6e7e0ebfdc9eccd96b6ed Dec 01 09:21:25 crc kubenswrapper[4867]: I1201 09:21:25.870354 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f"] Dec 01 09:21:25 crc kubenswrapper[4867]: W1201 09:21:25.874395 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6e4c850_11f6_495b_a90f_5936dda915e7.slice/crio-6d62d16c6998460009b3263adc09b1e2ddf066c1a317776f27786af0edfd2cc0 WatchSource:0}: Error finding container 6d62d16c6998460009b3263adc09b1e2ddf066c1a317776f27786af0edfd2cc0: Status 404 returned error can't find the container with id 6d62d16c6998460009b3263adc09b1e2ddf066c1a317776f27786af0edfd2cc0 Dec 01 09:21:26 crc kubenswrapper[4867]: I1201 09:21:26.031993 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8"] Dec 01 09:21:26 crc kubenswrapper[4867]: W1201 09:21:26.041758 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2549111_bcf2_4c87_abdd_0d4cd9353be9.slice/crio-47d3a9c496d4e47d5ad0d9d6f535d1304c1b9ef31fc61125e5a1b657f22822e4 WatchSource:0}: Error finding container 47d3a9c496d4e47d5ad0d9d6f535d1304c1b9ef31fc61125e5a1b657f22822e4: Status 404 returned error can't find the container with id 
47d3a9c496d4e47d5ad0d9d6f535d1304c1b9ef31fc61125e5a1b657f22822e4 Dec 01 09:21:26 crc kubenswrapper[4867]: I1201 09:21:26.089073 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb7678fb5-m5sb7"] Dec 01 09:21:26 crc kubenswrapper[4867]: I1201 09:21:26.650535 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt" event={"ID":"4749ce2f-6e1e-47ef-a5f1-bdd63f409214","Type":"ContainerStarted","Data":"c325d7c39805c32831dfe3ff73f571eb6da097d251f6e7e0ebfdc9eccd96b6ed"} Dec 01 09:21:26 crc kubenswrapper[4867]: I1201 09:21:26.651832 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cbknx" event={"ID":"7a3fd2df-a271-4ff0-8488-7f442aedf04e","Type":"ContainerStarted","Data":"eb26303d36c3b3a556f882c278365d6e5a88b56b2711d8837fd06b2a0c73a99f"} Dec 01 09:21:26 crc kubenswrapper[4867]: I1201 09:21:26.652663 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" event={"ID":"f6e4c850-11f6-495b-a90f-5936dda915e7","Type":"ContainerStarted","Data":"6d62d16c6998460009b3263adc09b1e2ddf066c1a317776f27786af0edfd2cc0"} Dec 01 09:21:26 crc kubenswrapper[4867]: I1201 09:21:26.653994 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb7678fb5-m5sb7" event={"ID":"440e37f6-4e09-4caa-bd2e-f35ab299e270","Type":"ContainerStarted","Data":"ab0152a624332db006b8c64f56c9358e4fcbe7ace09f9c760af0c86c12a99a19"} Dec 01 09:21:26 crc kubenswrapper[4867]: I1201 09:21:26.654837 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" event={"ID":"e2549111-bcf2-4c87-abdd-0d4cd9353be9","Type":"ContainerStarted","Data":"47d3a9c496d4e47d5ad0d9d6f535d1304c1b9ef31fc61125e5a1b657f22822e4"} Dec 01 09:21:27 crc kubenswrapper[4867]: I1201 09:21:27.326873 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-d6g9c" Dec 01 09:21:27 crc kubenswrapper[4867]: I1201 09:21:27.368935 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d6g9c" Dec 01 09:21:27 crc kubenswrapper[4867]: I1201 09:21:27.561535 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6g9c"] Dec 01 09:21:28 crc kubenswrapper[4867]: I1201 09:21:28.668910 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb7678fb5-m5sb7" event={"ID":"440e37f6-4e09-4caa-bd2e-f35ab299e270","Type":"ContainerStarted","Data":"fcafa3c24536c038f6fd4fec632605f77cc3c7bcc4b90d21834e07d0ae92071e"} Dec 01 09:21:28 crc kubenswrapper[4867]: I1201 09:21:28.669112 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d6g9c" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerName="registry-server" containerID="cri-o://3d668558f0462c91d4caeafce4f865adfb92e90a22e37f46b7e97a673189294c" gracePeriod=2 Dec 01 09:21:28 crc kubenswrapper[4867]: I1201 09:21:28.694025 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fb7678fb5-m5sb7" podStartSLOduration=3.69400575 podStartE2EDuration="3.69400575s" podCreationTimestamp="2025-12-01 09:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:21:28.688745913 +0000 UTC m=+810.148132667" watchObservedRunningTime="2025-12-01 09:21:28.69400575 +0000 UTC m=+810.153392494" Dec 01 09:21:29 crc kubenswrapper[4867]: I1201 09:21:29.677400 4867 generic.go:334] "Generic (PLEG): container finished" podID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerID="3d668558f0462c91d4caeafce4f865adfb92e90a22e37f46b7e97a673189294c" exitCode=0 Dec 01 09:21:29 crc kubenswrapper[4867]: I1201 09:21:29.677479 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6g9c" event={"ID":"1f764cf1-fc27-439c-8c61-9bc7e241d1fa","Type":"ContainerDied","Data":"3d668558f0462c91d4caeafce4f865adfb92e90a22e37f46b7e97a673189294c"} Dec 01 09:21:29 crc kubenswrapper[4867]: I1201 09:21:29.952268 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6g9c" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.091337 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr9bn\" (UniqueName: \"kubernetes.io/projected/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-kube-api-access-sr9bn\") pod \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.091791 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-utilities\") pod \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.091835 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-catalog-content\") pod \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\" (UID: \"1f764cf1-fc27-439c-8c61-9bc7e241d1fa\") " Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.092650 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-utilities" (OuterVolumeSpecName: "utilities") pod "1f764cf1-fc27-439c-8c61-9bc7e241d1fa" (UID: "1f764cf1-fc27-439c-8c61-9bc7e241d1fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.095781 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-kube-api-access-sr9bn" (OuterVolumeSpecName: "kube-api-access-sr9bn") pod "1f764cf1-fc27-439c-8c61-9bc7e241d1fa" (UID: "1f764cf1-fc27-439c-8c61-9bc7e241d1fa"). InnerVolumeSpecName "kube-api-access-sr9bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.193694 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr9bn\" (UniqueName: \"kubernetes.io/projected/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-kube-api-access-sr9bn\") on node \"crc\" DevicePath \"\"" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.193757 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.211743 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f764cf1-fc27-439c-8c61-9bc7e241d1fa" (UID: "1f764cf1-fc27-439c-8c61-9bc7e241d1fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.294748 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f764cf1-fc27-439c-8c61-9bc7e241d1fa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.691758 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" event={"ID":"e2549111-bcf2-4c87-abdd-0d4cd9353be9","Type":"ContainerStarted","Data":"30f5f3930fc52f7e099819f6fa7009a2f19a6743357d59eb4195e71baee16654"} Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.691935 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.695255 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt" event={"ID":"4749ce2f-6e1e-47ef-a5f1-bdd63f409214","Type":"ContainerStarted","Data":"c89eb14d861531c4b54ad31f3802fe4e3867fa3fbac51fe078a6a26fb5bad886"} Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.697411 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cbknx" event={"ID":"7a3fd2df-a271-4ff0-8488-7f442aedf04e","Type":"ContainerStarted","Data":"df63f607b8588494e950fa13846f5264e4626f079a9b25dc788e755e15c7cfcd"} Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.698064 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.700695 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" event={"ID":"f6e4c850-11f6-495b-a90f-5936dda915e7","Type":"ContainerStarted","Data":"82142c7f33108922ba1fbebe92f877ce4168fa5f2c48031bb25c44f2b799edcf"} Dec 
01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.703330 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6g9c" event={"ID":"1f764cf1-fc27-439c-8c61-9bc7e241d1fa","Type":"ContainerDied","Data":"8ac790cccc7ab1047467dc698e03f04bc9a76e2625eed39f11ff2e59014b984c"} Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.703365 4867 scope.go:117] "RemoveContainer" containerID="3d668558f0462c91d4caeafce4f865adfb92e90a22e37f46b7e97a673189294c" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.703452 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6g9c" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.712006 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" podStartSLOduration=1.7685610459999999 podStartE2EDuration="5.711983786s" podCreationTimestamp="2025-12-01 09:21:25 +0000 UTC" firstStartedPulling="2025-12-01 09:21:26.043715044 +0000 UTC m=+807.503101798" lastFinishedPulling="2025-12-01 09:21:29.987137784 +0000 UTC m=+811.446524538" observedRunningTime="2025-12-01 09:21:30.706718199 +0000 UTC m=+812.166104953" watchObservedRunningTime="2025-12-01 09:21:30.711983786 +0000 UTC m=+812.171370540" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.739111 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-cbknx" podStartSLOduration=1.534027661 podStartE2EDuration="5.739088983s" podCreationTimestamp="2025-12-01 09:21:25 +0000 UTC" firstStartedPulling="2025-12-01 09:21:25.784251812 +0000 UTC m=+807.243638566" lastFinishedPulling="2025-12-01 09:21:29.989313134 +0000 UTC m=+811.448699888" observedRunningTime="2025-12-01 09:21:30.725725699 +0000 UTC m=+812.185112463" watchObservedRunningTime="2025-12-01 09:21:30.739088983 +0000 UTC m=+812.198475737" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 
09:21:30.741504 4867 scope.go:117] "RemoveContainer" containerID="535c8888a364d10958a6d5eac8a33ae75e0047fa96246fde60b874743a40bc8e" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.749451 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-fzb6f" podStartSLOduration=1.6682669369999998 podStartE2EDuration="5.749426792s" podCreationTimestamp="2025-12-01 09:21:25 +0000 UTC" firstStartedPulling="2025-12-01 09:21:25.890634131 +0000 UTC m=+807.350020885" lastFinishedPulling="2025-12-01 09:21:29.971793596 +0000 UTC m=+811.431180740" observedRunningTime="2025-12-01 09:21:30.741313955 +0000 UTC m=+812.200700709" watchObservedRunningTime="2025-12-01 09:21:30.749426792 +0000 UTC m=+812.208813556" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.788947 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6g9c"] Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.789221 4867 scope.go:117] "RemoveContainer" containerID="a2bdfd3e39467d563c214005f41e4c5880b6f8912e2d53a492400b8d593da1f1" Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.794496 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d6g9c"] Dec 01 09:21:30 crc kubenswrapper[4867]: I1201 09:21:30.838569 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" path="/var/lib/kubelet/pods/1f764cf1-fc27-439c-8c61-9bc7e241d1fa/volumes" Dec 01 09:21:33 crc kubenswrapper[4867]: I1201 09:21:33.724441 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt" event={"ID":"4749ce2f-6e1e-47ef-a5f1-bdd63f409214","Type":"ContainerStarted","Data":"99e6fc8a40c5400f2b904e97b9e1568dc430b74a483ff6737a83969fb2a946f4"} Dec 01 09:21:33 crc kubenswrapper[4867]: I1201 09:21:33.748923 4867 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-dnfqt" podStartSLOduration=1.066228812 podStartE2EDuration="8.748901983s" podCreationTimestamp="2025-12-01 09:21:25 +0000 UTC" firstStartedPulling="2025-12-01 09:21:25.802922873 +0000 UTC m=+807.262309627" lastFinishedPulling="2025-12-01 09:21:33.485596044 +0000 UTC m=+814.944982798" observedRunningTime="2025-12-01 09:21:33.746392343 +0000 UTC m=+815.205779117" watchObservedRunningTime="2025-12-01 09:21:33.748901983 +0000 UTC m=+815.208288737" Dec 01 09:21:35 crc kubenswrapper[4867]: I1201 09:21:35.776260 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-cbknx" Dec 01 09:21:35 crc kubenswrapper[4867]: I1201 09:21:35.792874 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:35 crc kubenswrapper[4867]: I1201 09:21:35.792945 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:35 crc kubenswrapper[4867]: I1201 09:21:35.805084 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:36 crc kubenswrapper[4867]: I1201 09:21:36.746573 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fb7678fb5-m5sb7" Dec 01 09:21:36 crc kubenswrapper[4867]: I1201 09:21:36.805509 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kdm2m"] Dec 01 09:21:45 crc kubenswrapper[4867]: I1201 09:21:45.421416 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s47c8" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.491279 4867 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb"] Dec 01 09:21:57 crc kubenswrapper[4867]: E1201 09:21:57.492996 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerName="registry-server" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.493087 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerName="registry-server" Dec 01 09:21:57 crc kubenswrapper[4867]: E1201 09:21:57.493161 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerName="extract-content" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.493230 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerName="extract-content" Dec 01 09:21:57 crc kubenswrapper[4867]: E1201 09:21:57.493302 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerName="extract-utilities" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.493367 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerName="extract-utilities" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.493546 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f764cf1-fc27-439c-8c61-9bc7e241d1fa" containerName="registry-server" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.494537 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.507617 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb"] Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.513793 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.550305 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.550361 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.550387 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h7lf\" (UniqueName: \"kubernetes.io/projected/22246b0d-5ca8-4aa8-9cb5-0942b473e733-kube-api-access-7h7lf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: 
I1201 09:21:57.651793 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.651853 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.651872 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h7lf\" (UniqueName: \"kubernetes.io/projected/22246b0d-5ca8-4aa8-9cb5-0942b473e733-kube-api-access-7h7lf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.652427 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.652702 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.669255 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h7lf\" (UniqueName: \"kubernetes.io/projected/22246b0d-5ca8-4aa8-9cb5-0942b473e733-kube-api-access-7h7lf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:57 crc kubenswrapper[4867]: I1201 09:21:57.811769 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:21:58 crc kubenswrapper[4867]: I1201 09:21:58.030909 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb"] Dec 01 09:21:58 crc kubenswrapper[4867]: I1201 09:21:58.899868 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" event={"ID":"22246b0d-5ca8-4aa8-9cb5-0942b473e733","Type":"ContainerStarted","Data":"e4087384ab4b1356b6a494cd9cec7c6a409f69804b7602b5e695a10e49d42b6d"} Dec 01 09:21:58 crc kubenswrapper[4867]: I1201 09:21:58.900273 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" event={"ID":"22246b0d-5ca8-4aa8-9cb5-0942b473e733","Type":"ContainerStarted","Data":"886832bc6dbd470faeb7c657e3a758223321291454ed348fc532ba601d01aebc"} Dec 01 09:21:59 crc kubenswrapper[4867]: I1201 09:21:59.907656 4867 
generic.go:334] "Generic (PLEG): container finished" podID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerID="e4087384ab4b1356b6a494cd9cec7c6a409f69804b7602b5e695a10e49d42b6d" exitCode=0 Dec 01 09:21:59 crc kubenswrapper[4867]: I1201 09:21:59.907878 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" event={"ID":"22246b0d-5ca8-4aa8-9cb5-0942b473e733","Type":"ContainerDied","Data":"e4087384ab4b1356b6a494cd9cec7c6a409f69804b7602b5e695a10e49d42b6d"} Dec 01 09:22:01 crc kubenswrapper[4867]: I1201 09:22:01.846211 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-kdm2m" podUID="ec06a7ff-9325-4de9-b47e-d8315761bf8d" containerName="console" containerID="cri-o://004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1" gracePeriod=15 Dec 01 09:22:01 crc kubenswrapper[4867]: I1201 09:22:01.929376 4867 generic.go:334] "Generic (PLEG): container finished" podID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerID="57ffa157a380ec3e23e7147a54905d1c749fa27a5330d740f9ab02dac30f5374" exitCode=0 Dec 01 09:22:01 crc kubenswrapper[4867]: I1201 09:22:01.929425 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" event={"ID":"22246b0d-5ca8-4aa8-9cb5-0942b473e733","Type":"ContainerDied","Data":"57ffa157a380ec3e23e7147a54905d1c749fa27a5330d740f9ab02dac30f5374"} Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.239800 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kdm2m_ec06a7ff-9325-4de9-b47e-d8315761bf8d/console/0.log" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.239900 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.412721 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-config\") pod \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.412762 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-oauth-config\") pod \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.412796 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4xws\" (UniqueName: \"kubernetes.io/projected/ec06a7ff-9325-4de9-b47e-d8315761bf8d-kube-api-access-q4xws\") pod \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.412833 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-oauth-serving-cert\") pod \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.412864 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-service-ca\") pod \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.412884 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-serving-cert\") pod \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.412908 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-trusted-ca-bundle\") pod \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\" (UID: \"ec06a7ff-9325-4de9-b47e-d8315761bf8d\") " Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.413741 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-config" (OuterVolumeSpecName: "console-config") pod "ec06a7ff-9325-4de9-b47e-d8315761bf8d" (UID: "ec06a7ff-9325-4de9-b47e-d8315761bf8d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.413764 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-service-ca" (OuterVolumeSpecName: "service-ca") pod "ec06a7ff-9325-4de9-b47e-d8315761bf8d" (UID: "ec06a7ff-9325-4de9-b47e-d8315761bf8d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.413755 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ec06a7ff-9325-4de9-b47e-d8315761bf8d" (UID: "ec06a7ff-9325-4de9-b47e-d8315761bf8d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.414134 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ec06a7ff-9325-4de9-b47e-d8315761bf8d" (UID: "ec06a7ff-9325-4de9-b47e-d8315761bf8d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.418024 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec06a7ff-9325-4de9-b47e-d8315761bf8d-kube-api-access-q4xws" (OuterVolumeSpecName: "kube-api-access-q4xws") pod "ec06a7ff-9325-4de9-b47e-d8315761bf8d" (UID: "ec06a7ff-9325-4de9-b47e-d8315761bf8d"). InnerVolumeSpecName "kube-api-access-q4xws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.418304 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ec06a7ff-9325-4de9-b47e-d8315761bf8d" (UID: "ec06a7ff-9325-4de9-b47e-d8315761bf8d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.418925 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ec06a7ff-9325-4de9-b47e-d8315761bf8d" (UID: "ec06a7ff-9325-4de9-b47e-d8315761bf8d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.514064 4867 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.514094 4867 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.514105 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4xws\" (UniqueName: \"kubernetes.io/projected/ec06a7ff-9325-4de9-b47e-d8315761bf8d-kube-api-access-q4xws\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.514114 4867 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.514122 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.514131 4867 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec06a7ff-9325-4de9-b47e-d8315761bf8d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.514141 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec06a7ff-9325-4de9-b47e-d8315761bf8d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:02 crc 
kubenswrapper[4867]: I1201 09:22:02.942169 4867 generic.go:334] "Generic (PLEG): container finished" podID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerID="fb7b0fedff19a89d561d60b7e15e0938efdb9d4586cacc3858a1815582431c14" exitCode=0 Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.942299 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" event={"ID":"22246b0d-5ca8-4aa8-9cb5-0942b473e733","Type":"ContainerDied","Data":"fb7b0fedff19a89d561d60b7e15e0938efdb9d4586cacc3858a1815582431c14"} Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.944540 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kdm2m_ec06a7ff-9325-4de9-b47e-d8315761bf8d/console/0.log" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.944621 4867 generic.go:334] "Generic (PLEG): container finished" podID="ec06a7ff-9325-4de9-b47e-d8315761bf8d" containerID="004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1" exitCode=2 Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.944715 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kdm2m" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.944715 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kdm2m" event={"ID":"ec06a7ff-9325-4de9-b47e-d8315761bf8d","Type":"ContainerDied","Data":"004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1"} Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.944888 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kdm2m" event={"ID":"ec06a7ff-9325-4de9-b47e-d8315761bf8d","Type":"ContainerDied","Data":"84acfbda6f82e89267ec39872d471581fcb752c396e4b865f01f34d7f4ce009d"} Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.944929 4867 scope.go:117] "RemoveContainer" containerID="004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.966490 4867 scope.go:117] "RemoveContainer" containerID="004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1" Dec 01 09:22:02 crc kubenswrapper[4867]: E1201 09:22:02.967073 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1\": container with ID starting with 004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1 not found: ID does not exist" containerID="004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.967127 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1"} err="failed to get container status \"004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1\": rpc error: code = NotFound desc = could not find container \"004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1\": 
container with ID starting with 004aaa5e09b068e15bd178a988aa4767f5bf55589cd10f4b938d0647ea7f02f1 not found: ID does not exist" Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.991620 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kdm2m"] Dec 01 09:22:02 crc kubenswrapper[4867]: I1201 09:22:02.997411 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-kdm2m"] Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.239616 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.440024 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h7lf\" (UniqueName: \"kubernetes.io/projected/22246b0d-5ca8-4aa8-9cb5-0942b473e733-kube-api-access-7h7lf\") pod \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.440553 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-bundle\") pod \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.440689 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-util\") pod \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\" (UID: \"22246b0d-5ca8-4aa8-9cb5-0942b473e733\") " Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.441548 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-bundle" (OuterVolumeSpecName: "bundle") pod 
"22246b0d-5ca8-4aa8-9cb5-0942b473e733" (UID: "22246b0d-5ca8-4aa8-9cb5-0942b473e733"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.447759 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22246b0d-5ca8-4aa8-9cb5-0942b473e733-kube-api-access-7h7lf" (OuterVolumeSpecName: "kube-api-access-7h7lf") pod "22246b0d-5ca8-4aa8-9cb5-0942b473e733" (UID: "22246b0d-5ca8-4aa8-9cb5-0942b473e733"). InnerVolumeSpecName "kube-api-access-7h7lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.450192 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-util" (OuterVolumeSpecName: "util") pod "22246b0d-5ca8-4aa8-9cb5-0942b473e733" (UID: "22246b0d-5ca8-4aa8-9cb5-0942b473e733"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.541899 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h7lf\" (UniqueName: \"kubernetes.io/projected/22246b0d-5ca8-4aa8-9cb5-0942b473e733-kube-api-access-7h7lf\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.541946 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.541959 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22246b0d-5ca8-4aa8-9cb5-0942b473e733-util\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.839185 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ec06a7ff-9325-4de9-b47e-d8315761bf8d" path="/var/lib/kubelet/pods/ec06a7ff-9325-4de9-b47e-d8315761bf8d/volumes" Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.965523 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" event={"ID":"22246b0d-5ca8-4aa8-9cb5-0942b473e733","Type":"ContainerDied","Data":"886832bc6dbd470faeb7c657e3a758223321291454ed348fc532ba601d01aebc"} Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.965563 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="886832bc6dbd470faeb7c657e3a758223321291454ed348fc532ba601d01aebc" Dec 01 09:22:04 crc kubenswrapper[4867]: I1201 09:22:04.965659 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.067761 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5"] Dec 01 09:22:16 crc kubenswrapper[4867]: E1201 09:22:16.068454 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerName="util" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.068467 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerName="util" Dec 01 09:22:16 crc kubenswrapper[4867]: E1201 09:22:16.068476 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec06a7ff-9325-4de9-b47e-d8315761bf8d" containerName="console" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.068482 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec06a7ff-9325-4de9-b47e-d8315761bf8d" containerName="console" Dec 01 09:22:16 crc kubenswrapper[4867]: E1201 09:22:16.068498 4867 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerName="pull" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.068504 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerName="pull" Dec 01 09:22:16 crc kubenswrapper[4867]: E1201 09:22:16.068514 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerName="extract" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.068520 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerName="extract" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.068621 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="22246b0d-5ca8-4aa8-9cb5-0942b473e733" containerName="extract" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.068632 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec06a7ff-9325-4de9-b47e-d8315761bf8d" containerName="console" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.069025 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.074852 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.074962 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.075031 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.076326 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.076431 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-qp8n2" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.102043 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5"] Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.177384 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82e433dd-78d1-4cb0-a670-e19c67e09515-webhook-cert\") pod \"metallb-operator-controller-manager-6d79d8d46b-pxjk5\" (UID: \"82e433dd-78d1-4cb0-a670-e19c67e09515\") " pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.177694 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82e433dd-78d1-4cb0-a670-e19c67e09515-apiservice-cert\") pod \"metallb-operator-controller-manager-6d79d8d46b-pxjk5\" (UID: 
\"82e433dd-78d1-4cb0-a670-e19c67e09515\") " pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.177751 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r4dc\" (UniqueName: \"kubernetes.io/projected/82e433dd-78d1-4cb0-a670-e19c67e09515-kube-api-access-4r4dc\") pod \"metallb-operator-controller-manager-6d79d8d46b-pxjk5\" (UID: \"82e433dd-78d1-4cb0-a670-e19c67e09515\") " pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.282958 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82e433dd-78d1-4cb0-a670-e19c67e09515-apiservice-cert\") pod \"metallb-operator-controller-manager-6d79d8d46b-pxjk5\" (UID: \"82e433dd-78d1-4cb0-a670-e19c67e09515\") " pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.283030 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r4dc\" (UniqueName: \"kubernetes.io/projected/82e433dd-78d1-4cb0-a670-e19c67e09515-kube-api-access-4r4dc\") pod \"metallb-operator-controller-manager-6d79d8d46b-pxjk5\" (UID: \"82e433dd-78d1-4cb0-a670-e19c67e09515\") " pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.283112 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82e433dd-78d1-4cb0-a670-e19c67e09515-webhook-cert\") pod \"metallb-operator-controller-manager-6d79d8d46b-pxjk5\" (UID: \"82e433dd-78d1-4cb0-a670-e19c67e09515\") " pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.295672 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82e433dd-78d1-4cb0-a670-e19c67e09515-apiservice-cert\") pod \"metallb-operator-controller-manager-6d79d8d46b-pxjk5\" (UID: \"82e433dd-78d1-4cb0-a670-e19c67e09515\") " pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.298506 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82e433dd-78d1-4cb0-a670-e19c67e09515-webhook-cert\") pod \"metallb-operator-controller-manager-6d79d8d46b-pxjk5\" (UID: \"82e433dd-78d1-4cb0-a670-e19c67e09515\") " pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.306719 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r4dc\" (UniqueName: \"kubernetes.io/projected/82e433dd-78d1-4cb0-a670-e19c67e09515-kube-api-access-4r4dc\") pod \"metallb-operator-controller-manager-6d79d8d46b-pxjk5\" (UID: \"82e433dd-78d1-4cb0-a670-e19c67e09515\") " pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.366239 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4"] Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.367075 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.369860 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.370594 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-n9fpv" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.370617 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.383998 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4"] Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.384734 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb69f179-7caf-472c-9b20-f327c116f4a2-apiservice-cert\") pod \"metallb-operator-webhook-server-6f489b594c-qqhv4\" (UID: \"cb69f179-7caf-472c-9b20-f327c116f4a2\") " pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.384791 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52fzj\" (UniqueName: \"kubernetes.io/projected/cb69f179-7caf-472c-9b20-f327c116f4a2-kube-api-access-52fzj\") pod \"metallb-operator-webhook-server-6f489b594c-qqhv4\" (UID: \"cb69f179-7caf-472c-9b20-f327c116f4a2\") " pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.384829 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/cb69f179-7caf-472c-9b20-f327c116f4a2-webhook-cert\") pod \"metallb-operator-webhook-server-6f489b594c-qqhv4\" (UID: \"cb69f179-7caf-472c-9b20-f327c116f4a2\") " pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.385318 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.485573 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb69f179-7caf-472c-9b20-f327c116f4a2-apiservice-cert\") pod \"metallb-operator-webhook-server-6f489b594c-qqhv4\" (UID: \"cb69f179-7caf-472c-9b20-f327c116f4a2\") " pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.485613 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52fzj\" (UniqueName: \"kubernetes.io/projected/cb69f179-7caf-472c-9b20-f327c116f4a2-kube-api-access-52fzj\") pod \"metallb-operator-webhook-server-6f489b594c-qqhv4\" (UID: \"cb69f179-7caf-472c-9b20-f327c116f4a2\") " pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.485640 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb69f179-7caf-472c-9b20-f327c116f4a2-webhook-cert\") pod \"metallb-operator-webhook-server-6f489b594c-qqhv4\" (UID: \"cb69f179-7caf-472c-9b20-f327c116f4a2\") " pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.493668 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/cb69f179-7caf-472c-9b20-f327c116f4a2-webhook-cert\") pod \"metallb-operator-webhook-server-6f489b594c-qqhv4\" (UID: \"cb69f179-7caf-472c-9b20-f327c116f4a2\") " pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.501536 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb69f179-7caf-472c-9b20-f327c116f4a2-apiservice-cert\") pod \"metallb-operator-webhook-server-6f489b594c-qqhv4\" (UID: \"cb69f179-7caf-472c-9b20-f327c116f4a2\") " pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.512689 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52fzj\" (UniqueName: \"kubernetes.io/projected/cb69f179-7caf-472c-9b20-f327c116f4a2-kube-api-access-52fzj\") pod \"metallb-operator-webhook-server-6f489b594c-qqhv4\" (UID: \"cb69f179-7caf-472c-9b20-f327c116f4a2\") " pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.681393 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:16 crc kubenswrapper[4867]: I1201 09:22:16.695158 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5"] Dec 01 09:22:17 crc kubenswrapper[4867]: I1201 09:22:17.028427 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" event={"ID":"82e433dd-78d1-4cb0-a670-e19c67e09515","Type":"ContainerStarted","Data":"2d845f3e3f669295cbba598eed2e59812833aea9acb7d586ce284cdde5bd9dd4"} Dec 01 09:22:17 crc kubenswrapper[4867]: I1201 09:22:17.235778 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4"] Dec 01 09:22:17 crc kubenswrapper[4867]: W1201 09:22:17.240097 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb69f179_7caf_472c_9b20_f327c116f4a2.slice/crio-d053e0c3ef17250fe6ee2db16060b82ddb1cba62c5ba8b0ba0ecc82b9c79e1b8 WatchSource:0}: Error finding container d053e0c3ef17250fe6ee2db16060b82ddb1cba62c5ba8b0ba0ecc82b9c79e1b8: Status 404 returned error can't find the container with id d053e0c3ef17250fe6ee2db16060b82ddb1cba62c5ba8b0ba0ecc82b9c79e1b8 Dec 01 09:22:18 crc kubenswrapper[4867]: I1201 09:22:18.037020 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" event={"ID":"cb69f179-7caf-472c-9b20-f327c116f4a2","Type":"ContainerStarted","Data":"d053e0c3ef17250fe6ee2db16060b82ddb1cba62c5ba8b0ba0ecc82b9c79e1b8"} Dec 01 09:22:24 crc kubenswrapper[4867]: I1201 09:22:24.076379 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" 
event={"ID":"82e433dd-78d1-4cb0-a670-e19c67e09515","Type":"ContainerStarted","Data":"060a9b76ee6a7ae30fd5abb6d9656770987429b3f71f8d754cf725f3e746ad33"} Dec 01 09:22:24 crc kubenswrapper[4867]: I1201 09:22:24.077824 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:24 crc kubenswrapper[4867]: I1201 09:22:24.077962 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" event={"ID":"cb69f179-7caf-472c-9b20-f327c116f4a2","Type":"ContainerStarted","Data":"42c0513c91e0d0b0e7a23ff0831b0114b30a8a1fe1dcb440920e8bb18ba345f6"} Dec 01 09:22:24 crc kubenswrapper[4867]: I1201 09:22:24.078086 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:24 crc kubenswrapper[4867]: I1201 09:22:24.124792 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" podStartSLOduration=2.015772446 podStartE2EDuration="8.124772765s" podCreationTimestamp="2025-12-01 09:22:16 +0000 UTC" firstStartedPulling="2025-12-01 09:22:17.243162901 +0000 UTC m=+858.702549665" lastFinishedPulling="2025-12-01 09:22:23.35216323 +0000 UTC m=+864.811549984" observedRunningTime="2025-12-01 09:22:24.123292344 +0000 UTC m=+865.582679098" watchObservedRunningTime="2025-12-01 09:22:24.124772765 +0000 UTC m=+865.584159519" Dec 01 09:22:24 crc kubenswrapper[4867]: I1201 09:22:24.126936 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" podStartSLOduration=1.4898604180000001 podStartE2EDuration="8.126928764s" podCreationTimestamp="2025-12-01 09:22:16 +0000 UTC" firstStartedPulling="2025-12-01 09:22:16.713984374 +0000 UTC m=+858.173371128" lastFinishedPulling="2025-12-01 
09:22:23.3510527 +0000 UTC m=+864.810439474" observedRunningTime="2025-12-01 09:22:24.105194769 +0000 UTC m=+865.564581523" watchObservedRunningTime="2025-12-01 09:22:24.126928764 +0000 UTC m=+865.586315518" Dec 01 09:22:36 crc kubenswrapper[4867]: I1201 09:22:36.687678 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6f489b594c-qqhv4" Dec 01 09:22:51 crc kubenswrapper[4867]: I1201 09:22:51.601178 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:22:51 crc kubenswrapper[4867]: I1201 09:22:51.601515 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:22:56 crc kubenswrapper[4867]: I1201 09:22:56.387782 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6d79d8d46b-pxjk5" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.119289 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-stgvd"] Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.122169 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.124984 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"] Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.125563 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.125902 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.125956 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.127289 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-p65h8" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.127499 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.142196 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"] Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.228250 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-reloader\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.228303 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-frr-sockets\") pod \"frr-k8s-stgvd\" (UID: 
\"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.228334 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b1d4168c-add2-4db2-a491-761b0127d5b1-frr-startup\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.228355 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1d4168c-add2-4db2-a491-761b0127d5b1-metrics-certs\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.228383 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-metrics\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.228541 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62vbq\" (UniqueName: \"kubernetes.io/projected/b1d4168c-add2-4db2-a491-761b0127d5b1-kube-api-access-62vbq\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.228608 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvm2\" (UniqueName: \"kubernetes.io/projected/d2bcb3a5-5fb9-4c77-9f79-6d88033b8669-kube-api-access-tnvm2\") pod \"frr-k8s-webhook-server-7fcb986d4-4bvvs\" (UID: \"d2bcb3a5-5fb9-4c77-9f79-6d88033b8669\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.228643 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-frr-conf\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.228772 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2bcb3a5-5fb9-4c77-9f79-6d88033b8669-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-4bvvs\" (UID: \"d2bcb3a5-5fb9-4c77-9f79-6d88033b8669\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.237937 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-hfnbg"] Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.238881 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-hfnbg" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.240668 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.245684 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bczj5"] Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.246757 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bczj5" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.248474 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.250550 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9vzvx" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.250557 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.251477 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.282720 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-hfnbg"] Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.329962 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-reloader\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330014 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-frr-sockets\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330038 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b1d4168c-add2-4db2-a491-761b0127d5b1-frr-startup\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd" Dec 01 09:22:57 crc 
kubenswrapper[4867]: I1201 09:22:57.330067 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1d4168c-add2-4db2-a491-761b0127d5b1-metrics-certs\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330104 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-metrics\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330131 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62vbq\" (UniqueName: \"kubernetes.io/projected/b1d4168c-add2-4db2-a491-761b0127d5b1-kube-api-access-62vbq\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330163 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnvm2\" (UniqueName: \"kubernetes.io/projected/d2bcb3a5-5fb9-4c77-9f79-6d88033b8669-kube-api-access-tnvm2\") pod \"frr-k8s-webhook-server-7fcb986d4-4bvvs\" (UID: \"d2bcb3a5-5fb9-4c77-9f79-6d88033b8669\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330188 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-frr-conf\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330229 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2bcb3a5-5fb9-4c77-9f79-6d88033b8669-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-4bvvs\" (UID: \"d2bcb3a5-5fb9-4c77-9f79-6d88033b8669\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"
Dec 01 09:22:57 crc kubenswrapper[4867]: E1201 09:22:57.330349 4867 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Dec 01 09:22:57 crc kubenswrapper[4867]: E1201 09:22:57.330404 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2bcb3a5-5fb9-4c77-9f79-6d88033b8669-cert podName:d2bcb3a5-5fb9-4c77-9f79-6d88033b8669 nodeName:}" failed. No retries permitted until 2025-12-01 09:22:57.830384164 +0000 UTC m=+899.289770918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2bcb3a5-5fb9-4c77-9f79-6d88033b8669-cert") pod "frr-k8s-webhook-server-7fcb986d4-4bvvs" (UID: "d2bcb3a5-5fb9-4c77-9f79-6d88033b8669") : secret "frr-k8s-webhook-server-cert" not found
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330520 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-frr-sockets\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330582 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-reloader\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.330842 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-frr-conf\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: E1201 09:22:57.330910 4867 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Dec 01 09:22:57 crc kubenswrapper[4867]: E1201 09:22:57.330984 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1d4168c-add2-4db2-a491-761b0127d5b1-metrics-certs podName:b1d4168c-add2-4db2-a491-761b0127d5b1 nodeName:}" failed. No retries permitted until 2025-12-01 09:22:57.83097407 +0000 UTC m=+899.290360904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1d4168c-add2-4db2-a491-761b0127d5b1-metrics-certs") pod "frr-k8s-stgvd" (UID: "b1d4168c-add2-4db2-a491-761b0127d5b1") : secret "frr-k8s-certs-secret" not found
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.331072 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b1d4168c-add2-4db2-a491-761b0127d5b1-metrics\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.331569 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b1d4168c-add2-4db2-a491-761b0127d5b1-frr-startup\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.352592 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnvm2\" (UniqueName: \"kubernetes.io/projected/d2bcb3a5-5fb9-4c77-9f79-6d88033b8669-kube-api-access-tnvm2\") pod \"frr-k8s-webhook-server-7fcb986d4-4bvvs\" (UID: \"d2bcb3a5-5fb9-4c77-9f79-6d88033b8669\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.354936 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62vbq\" (UniqueName: \"kubernetes.io/projected/b1d4168c-add2-4db2-a491-761b0127d5b1-kube-api-access-62vbq\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.431709 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-metrics-certs\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.432057 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a-cert\") pod \"controller-f8648f98b-hfnbg\" (UID: \"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a\") " pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.432116 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv2cw\" (UniqueName: \"kubernetes.io/projected/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-kube-api-access-gv2cw\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.432136 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a-metrics-certs\") pod \"controller-f8648f98b-hfnbg\" (UID: \"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a\") " pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.432157 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxjsn\" (UniqueName: \"kubernetes.io/projected/cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a-kube-api-access-nxjsn\") pod \"controller-f8648f98b-hfnbg\" (UID: \"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a\") " pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.432187 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-metallb-excludel2\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.432215 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-memberlist\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.533724 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv2cw\" (UniqueName: \"kubernetes.io/projected/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-kube-api-access-gv2cw\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.533776 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a-metrics-certs\") pod \"controller-f8648f98b-hfnbg\" (UID: \"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a\") " pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.533833 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxjsn\" (UniqueName: \"kubernetes.io/projected/cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a-kube-api-access-nxjsn\") pod \"controller-f8648f98b-hfnbg\" (UID: \"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a\") " pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.533888 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-metallb-excludel2\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.533940 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-memberlist\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.533975 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-metrics-certs\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.534017 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a-cert\") pod \"controller-f8648f98b-hfnbg\" (UID: \"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a\") " pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: E1201 09:22:57.534153 4867 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 01 09:22:57 crc kubenswrapper[4867]: E1201 09:22:57.534219 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-memberlist podName:c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3 nodeName:}" failed. No retries permitted until 2025-12-01 09:22:58.034199008 +0000 UTC m=+899.493585862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-memberlist") pod "speaker-bczj5" (UID: "c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3") : secret "metallb-memberlist" not found
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.535224 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-metallb-excludel2\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.537401 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-metrics-certs\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.543613 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a-metrics-certs\") pod \"controller-f8648f98b-hfnbg\" (UID: \"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a\") " pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.545847 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a-cert\") pod \"controller-f8648f98b-hfnbg\" (UID: \"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a\") " pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.549205 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv2cw\" (UniqueName: \"kubernetes.io/projected/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-kube-api-access-gv2cw\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.549296 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxjsn\" (UniqueName: \"kubernetes.io/projected/cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a-kube-api-access-nxjsn\") pod \"controller-f8648f98b-hfnbg\" (UID: \"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a\") " pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.560313 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.837435 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2bcb3a5-5fb9-4c77-9f79-6d88033b8669-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-4bvvs\" (UID: \"d2bcb3a5-5fb9-4c77-9f79-6d88033b8669\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.837521 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1d4168c-add2-4db2-a491-761b0127d5b1-metrics-certs\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.845472 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1d4168c-add2-4db2-a491-761b0127d5b1-metrics-certs\") pod \"frr-k8s-stgvd\" (UID: \"b1d4168c-add2-4db2-a491-761b0127d5b1\") " pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:57 crc kubenswrapper[4867]: I1201 09:22:57.847368 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2bcb3a5-5fb9-4c77-9f79-6d88033b8669-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-4bvvs\" (UID: \"d2bcb3a5-5fb9-4c77-9f79-6d88033b8669\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"
Dec 01 09:22:58 crc kubenswrapper[4867]: I1201 09:22:58.040251 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-memberlist\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:58 crc kubenswrapper[4867]: E1201 09:22:58.040426 4867 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 01 09:22:58 crc kubenswrapper[4867]: E1201 09:22:58.040479 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-memberlist podName:c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3 nodeName:}" failed. No retries permitted until 2025-12-01 09:22:59.040462937 +0000 UTC m=+900.499849691 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-memberlist") pod "speaker-bczj5" (UID: "c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3") : secret "metallb-memberlist" not found
Dec 01 09:22:58 crc kubenswrapper[4867]: I1201 09:22:58.048037 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-hfnbg"]
Dec 01 09:22:58 crc kubenswrapper[4867]: I1201 09:22:58.050753 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:22:58 crc kubenswrapper[4867]: I1201 09:22:58.058246 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"
Dec 01 09:22:58 crc kubenswrapper[4867]: I1201 09:22:58.255613 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-hfnbg" event={"ID":"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a","Type":"ContainerStarted","Data":"c1e759d55f1856872fedf8804086b68a44b27c42bfe5002cd30f32bc69c8a769"}
Dec 01 09:22:58 crc kubenswrapper[4867]: W1201 09:22:58.499827 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2bcb3a5_5fb9_4c77_9f79_6d88033b8669.slice/crio-9e2da2c00888f238d099905738ec8dc4e8e6d83eedbd26e2893fe8a9448d7e5c WatchSource:0}: Error finding container 9e2da2c00888f238d099905738ec8dc4e8e6d83eedbd26e2893fe8a9448d7e5c: Status 404 returned error can't find the container with id 9e2da2c00888f238d099905738ec8dc4e8e6d83eedbd26e2893fe8a9448d7e5c
Dec 01 09:22:58 crc kubenswrapper[4867]: I1201 09:22:58.500146 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"]
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.051061 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-memberlist\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.069484 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3-memberlist\") pod \"speaker-bczj5\" (UID: \"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3\") " pod="metallb-system/speaker-bczj5"
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.091445 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9vzvx"
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.095520 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bczj5"
Dec 01 09:22:59 crc kubenswrapper[4867]: W1201 09:22:59.123387 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7f81cd9_bd19_4ed2_95d1_f8bb6fc5d6b3.slice/crio-b995e2b86faa6315c7c00b27076006a78ffb110d2a5c01c3b271c490a4801de7 WatchSource:0}: Error finding container b995e2b86faa6315c7c00b27076006a78ffb110d2a5c01c3b271c490a4801de7: Status 404 returned error can't find the container with id b995e2b86faa6315c7c00b27076006a78ffb110d2a5c01c3b271c490a4801de7
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.264732 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerStarted","Data":"0b737de99a2adc083c1f519ff2a4b282a811b33fb000cd26c8e2d3b4c6fbdfff"}
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.267193 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-hfnbg" event={"ID":"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a","Type":"ContainerStarted","Data":"65b09b4a12e59b852f9c290b4b7b71ac8a898c73832e53f4f081fc496ca27f3c"}
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.267226 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-hfnbg" event={"ID":"cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a","Type":"ContainerStarted","Data":"ef4f5d66d225a724113a73d3ec91adf1aebd3012c7f7383386e80612e6477712"}
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.267953 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.269456 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs" event={"ID":"d2bcb3a5-5fb9-4c77-9f79-6d88033b8669","Type":"ContainerStarted","Data":"9e2da2c00888f238d099905738ec8dc4e8e6d83eedbd26e2893fe8a9448d7e5c"}
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.271968 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bczj5" event={"ID":"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3","Type":"ContainerStarted","Data":"b995e2b86faa6315c7c00b27076006a78ffb110d2a5c01c3b271c490a4801de7"}
Dec 01 09:22:59 crc kubenswrapper[4867]: I1201 09:22:59.293505 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-hfnbg" podStartSLOduration=2.293483352 podStartE2EDuration="2.293483352s" podCreationTimestamp="2025-12-01 09:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:22:59.290225543 +0000 UTC m=+900.749612297" watchObservedRunningTime="2025-12-01 09:22:59.293483352 +0000 UTC m=+900.752870106"
Dec 01 09:23:00 crc kubenswrapper[4867]: I1201 09:23:00.285191 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bczj5" event={"ID":"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3","Type":"ContainerStarted","Data":"c5f79254fc986550c66ae95e5f9859d96eac75b5a6b7070ec474c695675867cf"}
Dec 01 09:23:00 crc kubenswrapper[4867]: I1201 09:23:00.285483 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bczj5"
Dec 01 09:23:00 crc kubenswrapper[4867]: I1201 09:23:00.285495 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bczj5" event={"ID":"c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3","Type":"ContainerStarted","Data":"61ed74d3a2b9b368976e70e11f1373ec594dee324446f8ab73f300a87799c522"}
Dec 01 09:23:00 crc kubenswrapper[4867]: I1201 09:23:00.309341 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bczj5" podStartSLOduration=3.309324259 podStartE2EDuration="3.309324259s" podCreationTimestamp="2025-12-01 09:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:23:00.305473784 +0000 UTC m=+901.764860548" watchObservedRunningTime="2025-12-01 09:23:00.309324259 +0000 UTC m=+901.768711013"
Dec 01 09:23:07 crc kubenswrapper[4867]: I1201 09:23:07.335944 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs" event={"ID":"d2bcb3a5-5fb9-4c77-9f79-6d88033b8669","Type":"ContainerStarted","Data":"6d20f5d12b8a94c315c06f06fa567621fb4dd80640383deb7cb4d82082b9dc81"}
Dec 01 09:23:07 crc kubenswrapper[4867]: I1201 09:23:07.336462 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs"
Dec 01 09:23:07 crc kubenswrapper[4867]: I1201 09:23:07.338713 4867 generic.go:334] "Generic (PLEG): container finished" podID="b1d4168c-add2-4db2-a491-761b0127d5b1" containerID="8a963d6bbd211abab6b5da47387b122bbf7c647352c2d9195bd502b448462a8d" exitCode=0
Dec 01 09:23:07 crc kubenswrapper[4867]: I1201 09:23:07.338775 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerDied","Data":"8a963d6bbd211abab6b5da47387b122bbf7c647352c2d9195bd502b448462a8d"}
Dec 01 09:23:07 crc kubenswrapper[4867]: I1201 09:23:07.397697 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs" podStartSLOduration=2.330496996 podStartE2EDuration="10.397678247s" podCreationTimestamp="2025-12-01 09:22:57 +0000 UTC" firstStartedPulling="2025-12-01 09:22:58.514959485 +0000 UTC m=+899.974346239" lastFinishedPulling="2025-12-01 09:23:06.582140736 +0000 UTC m=+908.041527490" observedRunningTime="2025-12-01 09:23:07.356000925 +0000 UTC m=+908.815387679" watchObservedRunningTime="2025-12-01 09:23:07.397678247 +0000 UTC m=+908.857065011"
Dec 01 09:23:08 crc kubenswrapper[4867]: I1201 09:23:08.345688 4867 generic.go:334] "Generic (PLEG): container finished" podID="b1d4168c-add2-4db2-a491-761b0127d5b1" containerID="ad7f198012242713b526ca41c08e3436f3432f6a84623aba5d9c3563f1d9634d" exitCode=0
Dec 01 09:23:08 crc kubenswrapper[4867]: I1201 09:23:08.345754 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerDied","Data":"ad7f198012242713b526ca41c08e3436f3432f6a84623aba5d9c3563f1d9634d"}
Dec 01 09:23:09 crc kubenswrapper[4867]: I1201 09:23:09.098713 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bczj5"
Dec 01 09:23:09 crc kubenswrapper[4867]: I1201 09:23:09.352518 4867 generic.go:334] "Generic (PLEG): container finished" podID="b1d4168c-add2-4db2-a491-761b0127d5b1" containerID="3e38527fd6ca3be78f4c0b0a578129cf9aedef3955cfa6bbe666d78cac41a351" exitCode=0
Dec 01 09:23:09 crc kubenswrapper[4867]: I1201 09:23:09.352558 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerDied","Data":"3e38527fd6ca3be78f4c0b0a578129cf9aedef3955cfa6bbe666d78cac41a351"}
Dec 01 09:23:10 crc kubenswrapper[4867]: I1201 09:23:10.364079 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerStarted","Data":"383d5c14ac032f5a5aab247270fb334ca63050e00a6c157243aec2d012e97e76"}
Dec 01 09:23:11 crc kubenswrapper[4867]: I1201 09:23:11.373803 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerStarted","Data":"8330e1d8c4dc13585a8bb4cb91e39dc369c195831df88d13ddbff7f1aea77eb1"}
Dec 01 09:23:11 crc kubenswrapper[4867]: I1201 09:23:11.374156 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:23:11 crc kubenswrapper[4867]: I1201 09:23:11.374169 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerStarted","Data":"b2786b59f4facd49e97b3384687cd03f6dd6e3e8d5175e4a937dbb66bc370c96"}
Dec 01 09:23:11 crc kubenswrapper[4867]: I1201 09:23:11.374181 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerStarted","Data":"4c7a592c5635fe51010e3163e8e033dfdd3049cf23dfac1339f77558b052cb07"}
Dec 01 09:23:11 crc kubenswrapper[4867]: I1201 09:23:11.374191 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerStarted","Data":"001959fb51c7f8d260966cd4dda4eff04c6b67b87e8777d75df5fefb43cb8e6f"}
Dec 01 09:23:11 crc kubenswrapper[4867]: I1201 09:23:11.374201 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stgvd" event={"ID":"b1d4168c-add2-4db2-a491-761b0127d5b1","Type":"ContainerStarted","Data":"0593164cb3bec7eed451dda5ae05124b95d277fc4a35c7f37766d5fc8a0d322e"}
Dec 01 09:23:11 crc kubenswrapper[4867]: I1201 09:23:11.399160 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-stgvd" podStartSLOduration=6.118686457 podStartE2EDuration="14.399140221s" podCreationTimestamp="2025-12-01 09:22:57 +0000 UTC" firstStartedPulling="2025-12-01 09:22:58.301564029 +0000 UTC m=+899.760950783" lastFinishedPulling="2025-12-01 09:23:06.582017793 +0000 UTC m=+908.041404547" observedRunningTime="2025-12-01 09:23:11.392707755 +0000 UTC m=+912.852094509" watchObservedRunningTime="2025-12-01 09:23:11.399140221 +0000 UTC m=+912.858526975"
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.480207 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-j72pm"]
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.481422 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j72pm"
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.483501 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-wthqb"
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.483841 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.491916 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j72pm"]
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.493850 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.637298 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tz7k\" (UniqueName: \"kubernetes.io/projected/e87455bb-687d-418a-aa19-421092c0b48b-kube-api-access-6tz7k\") pod \"openstack-operator-index-j72pm\" (UID: \"e87455bb-687d-418a-aa19-421092c0b48b\") " pod="openstack-operators/openstack-operator-index-j72pm"
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.738547 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tz7k\" (UniqueName: \"kubernetes.io/projected/e87455bb-687d-418a-aa19-421092c0b48b-kube-api-access-6tz7k\") pod \"openstack-operator-index-j72pm\" (UID: \"e87455bb-687d-418a-aa19-421092c0b48b\") " pod="openstack-operators/openstack-operator-index-j72pm"
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.782534 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tz7k\" (UniqueName: \"kubernetes.io/projected/e87455bb-687d-418a-aa19-421092c0b48b-kube-api-access-6tz7k\") pod \"openstack-operator-index-j72pm\" (UID: \"e87455bb-687d-418a-aa19-421092c0b48b\") " pod="openstack-operators/openstack-operator-index-j72pm"
Dec 01 09:23:12 crc kubenswrapper[4867]: I1201 09:23:12.798926 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j72pm"
Dec 01 09:23:13 crc kubenswrapper[4867]: I1201 09:23:13.053103 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:23:13 crc kubenswrapper[4867]: I1201 09:23:13.102117 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j72pm"]
Dec 01 09:23:13 crc kubenswrapper[4867]: I1201 09:23:13.120204 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-stgvd"
Dec 01 09:23:13 crc kubenswrapper[4867]: I1201 09:23:13.418994 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j72pm" event={"ID":"e87455bb-687d-418a-aa19-421092c0b48b","Type":"ContainerStarted","Data":"e420861b092dcaea5108a2c8f3fa789497d394c80ddbe5fa07b6fc20848f944c"}
Dec 01 09:23:16 crc kubenswrapper[4867]: I1201 09:23:16.527772 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-j72pm"]
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.134872 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zvsh9"]
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.136212 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zvsh9"
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.151222 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zvsh9"]
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.222355 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgfp4\" (UniqueName: \"kubernetes.io/projected/cbb9c171-f076-44a2-9a0a-fafd9aa101ca-kube-api-access-sgfp4\") pod \"openstack-operator-index-zvsh9\" (UID: \"cbb9c171-f076-44a2-9a0a-fafd9aa101ca\") " pod="openstack-operators/openstack-operator-index-zvsh9"
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.323195 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgfp4\" (UniqueName: \"kubernetes.io/projected/cbb9c171-f076-44a2-9a0a-fafd9aa101ca-kube-api-access-sgfp4\") pod \"openstack-operator-index-zvsh9\" (UID: \"cbb9c171-f076-44a2-9a0a-fafd9aa101ca\") " pod="openstack-operators/openstack-operator-index-zvsh9"
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.342110 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgfp4\" (UniqueName: \"kubernetes.io/projected/cbb9c171-f076-44a2-9a0a-fafd9aa101ca-kube-api-access-sgfp4\") pod \"openstack-operator-index-zvsh9\" (UID: \"cbb9c171-f076-44a2-9a0a-fafd9aa101ca\") " pod="openstack-operators/openstack-operator-index-zvsh9"
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.451260 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j72pm" event={"ID":"e87455bb-687d-418a-aa19-421092c0b48b","Type":"ContainerStarted","Data":"2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d"}
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.451373 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-j72pm" podUID="e87455bb-687d-418a-aa19-421092c0b48b" containerName="registry-server" containerID="cri-o://2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d" gracePeriod=2
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.467705 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-j72pm" podStartSLOduration=1.840299893 podStartE2EDuration="5.467685351s" podCreationTimestamp="2025-12-01 09:23:12 +0000 UTC" firstStartedPulling="2025-12-01 09:23:13.123755365 +0000 UTC m=+914.583142119" lastFinishedPulling="2025-12-01 09:23:16.751140823 +0000 UTC m=+918.210527577" observedRunningTime="2025-12-01 09:23:17.464307119 +0000 UTC m=+918.923693893" watchObservedRunningTime="2025-12-01 09:23:17.467685351 +0000 UTC m=+918.927072095"
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.504280 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zvsh9"
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.566918 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-hfnbg"
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.845979 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j72pm"
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.933194 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tz7k\" (UniqueName: \"kubernetes.io/projected/e87455bb-687d-418a-aa19-421092c0b48b-kube-api-access-6tz7k\") pod \"e87455bb-687d-418a-aa19-421092c0b48b\" (UID: \"e87455bb-687d-418a-aa19-421092c0b48b\") "
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.937397 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87455bb-687d-418a-aa19-421092c0b48b-kube-api-access-6tz7k" (OuterVolumeSpecName: "kube-api-access-6tz7k") pod "e87455bb-687d-418a-aa19-421092c0b48b" (UID: "e87455bb-687d-418a-aa19-421092c0b48b"). InnerVolumeSpecName "kube-api-access-6tz7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:23:17 crc kubenswrapper[4867]: I1201 09:23:17.938762 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zvsh9"]
Dec 01 09:23:17 crc kubenswrapper[4867]: W1201 09:23:17.939937 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb9c171_f076_44a2_9a0a_fafd9aa101ca.slice/crio-863627f0cf94de9af6923836d535f7490fe7dcb2d2e524635729ef7578f51a5c WatchSource:0}: Error finding container 863627f0cf94de9af6923836d535f7490fe7dcb2d2e524635729ef7578f51a5c: Status 404 returned error can't find the container with id 863627f0cf94de9af6923836d535f7490fe7dcb2d2e524635729ef7578f51a5c
Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.034321 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tz7k\" (UniqueName: \"kubernetes.io/projected/e87455bb-687d-418a-aa19-421092c0b48b-kube-api-access-6tz7k\") on node \"crc\" DevicePath \"\""
Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.062634 4867 kubelet.go:2542]
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4bvvs" Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.457585 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zvsh9" event={"ID":"cbb9c171-f076-44a2-9a0a-fafd9aa101ca","Type":"ContainerStarted","Data":"28f60d1701821e33ea742329efa380f2418de3f3355a824a39ddab1f6e235274"} Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.457628 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zvsh9" event={"ID":"cbb9c171-f076-44a2-9a0a-fafd9aa101ca","Type":"ContainerStarted","Data":"863627f0cf94de9af6923836d535f7490fe7dcb2d2e524635729ef7578f51a5c"} Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.459437 4867 generic.go:334] "Generic (PLEG): container finished" podID="e87455bb-687d-418a-aa19-421092c0b48b" containerID="2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d" exitCode=0 Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.459478 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j72pm" event={"ID":"e87455bb-687d-418a-aa19-421092c0b48b","Type":"ContainerDied","Data":"2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d"} Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.459537 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j72pm" event={"ID":"e87455bb-687d-418a-aa19-421092c0b48b","Type":"ContainerDied","Data":"e420861b092dcaea5108a2c8f3fa789497d394c80ddbe5fa07b6fc20848f944c"} Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.459559 4867 scope.go:117] "RemoveContainer" containerID="2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d" Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.459494 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-j72pm" Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.475235 4867 scope.go:117] "RemoveContainer" containerID="2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d" Dec 01 09:23:18 crc kubenswrapper[4867]: E1201 09:23:18.476015 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d\": container with ID starting with 2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d not found: ID does not exist" containerID="2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d" Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.476066 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d"} err="failed to get container status \"2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d\": rpc error: code = NotFound desc = could not find container \"2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d\": container with ID starting with 2fc32c31ce60f7b42a631d69dceed10d4f58a7dcc832c27d369518026bc0c17d not found: ID does not exist" Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.480457 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zvsh9" podStartSLOduration=1.425682116 podStartE2EDuration="1.480439355s" podCreationTimestamp="2025-12-01 09:23:17 +0000 UTC" firstStartedPulling="2025-12-01 09:23:17.944273008 +0000 UTC m=+919.403659752" lastFinishedPulling="2025-12-01 09:23:17.999030237 +0000 UTC m=+919.458416991" observedRunningTime="2025-12-01 09:23:18.478410029 +0000 UTC m=+919.937796783" watchObservedRunningTime="2025-12-01 09:23:18.480439355 +0000 UTC m=+919.939826109" Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 
09:23:18.492902 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-j72pm"] Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.496424 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-j72pm"] Dec 01 09:23:18 crc kubenswrapper[4867]: I1201 09:23:18.834691 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e87455bb-687d-418a-aa19-421092c0b48b" path="/var/lib/kubelet/pods/e87455bb-687d-418a-aa19-421092c0b48b/volumes" Dec 01 09:23:21 crc kubenswrapper[4867]: I1201 09:23:21.601688 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:23:21 crc kubenswrapper[4867]: I1201 09:23:21.602074 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.504704 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zvsh9" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.505056 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zvsh9" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.531452 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zvsh9" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.556339 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zvsh9" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.735892 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wtm5w"] Dec 01 09:23:27 crc kubenswrapper[4867]: E1201 09:23:27.736191 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87455bb-687d-418a-aa19-421092c0b48b" containerName="registry-server" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.736217 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87455bb-687d-418a-aa19-421092c0b48b" containerName="registry-server" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.736343 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87455bb-687d-418a-aa19-421092c0b48b" containerName="registry-server" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.737326 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.744948 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtm5w"] Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.850863 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqrt6\" (UniqueName: \"kubernetes.io/projected/1ec1b9ad-4668-438b-9466-c3be6007b938-kube-api-access-gqrt6\") pod \"redhat-marketplace-wtm5w\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.850928 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-catalog-content\") pod \"redhat-marketplace-wtm5w\" (UID: 
\"1ec1b9ad-4668-438b-9466-c3be6007b938\") " pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.850961 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-utilities\") pod \"redhat-marketplace-wtm5w\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.951753 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqrt6\" (UniqueName: \"kubernetes.io/projected/1ec1b9ad-4668-438b-9466-c3be6007b938-kube-api-access-gqrt6\") pod \"redhat-marketplace-wtm5w\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.951819 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-catalog-content\") pod \"redhat-marketplace-wtm5w\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.951857 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-utilities\") pod \"redhat-marketplace-wtm5w\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.952457 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-catalog-content\") pod \"redhat-marketplace-wtm5w\" (UID: 
\"1ec1b9ad-4668-438b-9466-c3be6007b938\") " pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.952494 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-utilities\") pod \"redhat-marketplace-wtm5w\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:27 crc kubenswrapper[4867]: I1201 09:23:27.972162 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqrt6\" (UniqueName: \"kubernetes.io/projected/1ec1b9ad-4668-438b-9466-c3be6007b938-kube-api-access-gqrt6\") pod \"redhat-marketplace-wtm5w\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:28 crc kubenswrapper[4867]: I1201 09:23:28.051284 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:28 crc kubenswrapper[4867]: I1201 09:23:28.056392 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-stgvd" Dec 01 09:23:28 crc kubenswrapper[4867]: I1201 09:23:28.490898 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtm5w"] Dec 01 09:23:28 crc kubenswrapper[4867]: I1201 09:23:28.535150 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtm5w" event={"ID":"1ec1b9ad-4668-438b-9466-c3be6007b938","Type":"ContainerStarted","Data":"855659b0108cef84eb559ca8b65f622c23101a19e8ea4e1abf10e007d263f7f8"} Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.375330 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm"] Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.376863 
4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.377591 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-util\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.377695 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-bundle\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.377733 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j2vl\" (UniqueName: \"kubernetes.io/projected/438274bb-f607-4eef-af53-59566f7176d1-kube-api-access-5j2vl\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.378716 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-dz4dx" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.395702 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm"] Dec 
01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.481702 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-bundle\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.481769 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j2vl\" (UniqueName: \"kubernetes.io/projected/438274bb-f607-4eef-af53-59566f7176d1-kube-api-access-5j2vl\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.481846 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-util\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.482314 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-util\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.482556 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-bundle\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.530038 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j2vl\" (UniqueName: \"kubernetes.io/projected/438274bb-f607-4eef-af53-59566f7176d1-kube-api-access-5j2vl\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.553580 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtm5w" event={"ID":"1ec1b9ad-4668-438b-9466-c3be6007b938","Type":"ContainerStarted","Data":"7c3333519dd244ffb89051e89f2cf29a177946b11909bd00d12c5b6607ba8b60"} Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.691716 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:29 crc kubenswrapper[4867]: I1201 09:23:29.912732 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm"] Dec 01 09:23:29 crc kubenswrapper[4867]: W1201 09:23:29.918407 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod438274bb_f607_4eef_af53_59566f7176d1.slice/crio-434d4b05fdf0d1f6b02a0ba82d014dceada2c5e1306429e4bbb290409ccc8047 WatchSource:0}: Error finding container 434d4b05fdf0d1f6b02a0ba82d014dceada2c5e1306429e4bbb290409ccc8047: Status 404 returned error can't find the container with id 434d4b05fdf0d1f6b02a0ba82d014dceada2c5e1306429e4bbb290409ccc8047 Dec 01 09:23:30 crc kubenswrapper[4867]: I1201 09:23:30.560558 4867 generic.go:334] "Generic (PLEG): container finished" podID="438274bb-f607-4eef-af53-59566f7176d1" containerID="7e25148d2d829c0548786dc687dedd798bb3f99f8d0d6111f223a8f1ebb4d241" exitCode=0 Dec 01 09:23:30 crc kubenswrapper[4867]: I1201 09:23:30.560650 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" event={"ID":"438274bb-f607-4eef-af53-59566f7176d1","Type":"ContainerDied","Data":"7e25148d2d829c0548786dc687dedd798bb3f99f8d0d6111f223a8f1ebb4d241"} Dec 01 09:23:30 crc kubenswrapper[4867]: I1201 09:23:30.560905 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" event={"ID":"438274bb-f607-4eef-af53-59566f7176d1","Type":"ContainerStarted","Data":"434d4b05fdf0d1f6b02a0ba82d014dceada2c5e1306429e4bbb290409ccc8047"} Dec 01 09:23:30 crc kubenswrapper[4867]: I1201 09:23:30.563415 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerID="7c3333519dd244ffb89051e89f2cf29a177946b11909bd00d12c5b6607ba8b60" exitCode=0 Dec 01 09:23:30 crc kubenswrapper[4867]: I1201 09:23:30.563454 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtm5w" event={"ID":"1ec1b9ad-4668-438b-9466-c3be6007b938","Type":"ContainerDied","Data":"7c3333519dd244ffb89051e89f2cf29a177946b11909bd00d12c5b6607ba8b60"} Dec 01 09:23:31 crc kubenswrapper[4867]: I1201 09:23:31.572058 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtm5w" event={"ID":"1ec1b9ad-4668-438b-9466-c3be6007b938","Type":"ContainerStarted","Data":"49e449a68c4a79772eafde172c7d60d7829ab4481bf31c8145bf549c6f324649"} Dec 01 09:23:31 crc kubenswrapper[4867]: I1201 09:23:31.574364 4867 generic.go:334] "Generic (PLEG): container finished" podID="438274bb-f607-4eef-af53-59566f7176d1" containerID="1dedae320244d5ab5c01ee6b7d8dac4ec761194aa08c433387577744013964d3" exitCode=0 Dec 01 09:23:31 crc kubenswrapper[4867]: I1201 09:23:31.574407 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" event={"ID":"438274bb-f607-4eef-af53-59566f7176d1","Type":"ContainerDied","Data":"1dedae320244d5ab5c01ee6b7d8dac4ec761194aa08c433387577744013964d3"} Dec 01 09:23:32 crc kubenswrapper[4867]: I1201 09:23:32.580993 4867 generic.go:334] "Generic (PLEG): container finished" podID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerID="49e449a68c4a79772eafde172c7d60d7829ab4481bf31c8145bf549c6f324649" exitCode=0 Dec 01 09:23:32 crc kubenswrapper[4867]: I1201 09:23:32.581042 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtm5w" event={"ID":"1ec1b9ad-4668-438b-9466-c3be6007b938","Type":"ContainerDied","Data":"49e449a68c4a79772eafde172c7d60d7829ab4481bf31c8145bf549c6f324649"} Dec 01 09:23:32 crc 
kubenswrapper[4867]: I1201 09:23:32.586108 4867 generic.go:334] "Generic (PLEG): container finished" podID="438274bb-f607-4eef-af53-59566f7176d1" containerID="4df6618534444bf76c6504419c0259d6c58219b53bf5b86c4a0a7abd226de5b4" exitCode=0 Dec 01 09:23:32 crc kubenswrapper[4867]: I1201 09:23:32.586165 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" event={"ID":"438274bb-f607-4eef-af53-59566f7176d1","Type":"ContainerDied","Data":"4df6618534444bf76c6504419c0259d6c58219b53bf5b86c4a0a7abd226de5b4"} Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.595181 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtm5w" event={"ID":"1ec1b9ad-4668-438b-9466-c3be6007b938","Type":"ContainerStarted","Data":"374ff30e2323fd02a1ccd54458943cc9865941d167a57f860ab6861b6e103ac8"} Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.618655 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wtm5w" podStartSLOduration=3.830747467 podStartE2EDuration="6.618636708s" podCreationTimestamp="2025-12-01 09:23:27 +0000 UTC" firstStartedPulling="2025-12-01 09:23:30.564907475 +0000 UTC m=+932.024294229" lastFinishedPulling="2025-12-01 09:23:33.352796716 +0000 UTC m=+934.812183470" observedRunningTime="2025-12-01 09:23:33.614572546 +0000 UTC m=+935.073959300" watchObservedRunningTime="2025-12-01 09:23:33.618636708 +0000 UTC m=+935.078023462" Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.831183 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.835206 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j2vl\" (UniqueName: \"kubernetes.io/projected/438274bb-f607-4eef-af53-59566f7176d1-kube-api-access-5j2vl\") pod \"438274bb-f607-4eef-af53-59566f7176d1\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.856628 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438274bb-f607-4eef-af53-59566f7176d1-kube-api-access-5j2vl" (OuterVolumeSpecName: "kube-api-access-5j2vl") pod "438274bb-f607-4eef-af53-59566f7176d1" (UID: "438274bb-f607-4eef-af53-59566f7176d1"). InnerVolumeSpecName "kube-api-access-5j2vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.935935 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-util\") pod \"438274bb-f607-4eef-af53-59566f7176d1\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.936031 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-bundle\") pod \"438274bb-f607-4eef-af53-59566f7176d1\" (UID: \"438274bb-f607-4eef-af53-59566f7176d1\") " Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.936300 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j2vl\" (UniqueName: \"kubernetes.io/projected/438274bb-f607-4eef-af53-59566f7176d1-kube-api-access-5j2vl\") on node \"crc\" DevicePath \"\"" Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.937004 4867 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-bundle" (OuterVolumeSpecName: "bundle") pod "438274bb-f607-4eef-af53-59566f7176d1" (UID: "438274bb-f607-4eef-af53-59566f7176d1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:23:33 crc kubenswrapper[4867]: I1201 09:23:33.955036 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-util" (OuterVolumeSpecName: "util") pod "438274bb-f607-4eef-af53-59566f7176d1" (UID: "438274bb-f607-4eef-af53-59566f7176d1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:23:34 crc kubenswrapper[4867]: I1201 09:23:34.037123 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-util\") on node \"crc\" DevicePath \"\"" Dec 01 09:23:34 crc kubenswrapper[4867]: I1201 09:23:34.037166 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/438274bb-f607-4eef-af53-59566f7176d1-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:23:34 crc kubenswrapper[4867]: I1201 09:23:34.602880 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" event={"ID":"438274bb-f607-4eef-af53-59566f7176d1","Type":"ContainerDied","Data":"434d4b05fdf0d1f6b02a0ba82d014dceada2c5e1306429e4bbb290409ccc8047"} Dec 01 09:23:34 crc kubenswrapper[4867]: I1201 09:23:34.602917 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="434d4b05fdf0d1f6b02a0ba82d014dceada2c5e1306429e4bbb290409ccc8047" Dec 01 09:23:34 crc kubenswrapper[4867]: I1201 09:23:34.603613 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm" Dec 01 09:23:36 crc kubenswrapper[4867]: I1201 09:23:36.960389 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc"] Dec 01 09:23:36 crc kubenswrapper[4867]: E1201 09:23:36.961250 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438274bb-f607-4eef-af53-59566f7176d1" containerName="extract" Dec 01 09:23:36 crc kubenswrapper[4867]: I1201 09:23:36.961265 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="438274bb-f607-4eef-af53-59566f7176d1" containerName="extract" Dec 01 09:23:36 crc kubenswrapper[4867]: E1201 09:23:36.961282 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438274bb-f607-4eef-af53-59566f7176d1" containerName="util" Dec 01 09:23:36 crc kubenswrapper[4867]: I1201 09:23:36.961291 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="438274bb-f607-4eef-af53-59566f7176d1" containerName="util" Dec 01 09:23:36 crc kubenswrapper[4867]: E1201 09:23:36.961302 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438274bb-f607-4eef-af53-59566f7176d1" containerName="pull" Dec 01 09:23:36 crc kubenswrapper[4867]: I1201 09:23:36.961310 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="438274bb-f607-4eef-af53-59566f7176d1" containerName="pull" Dec 01 09:23:36 crc kubenswrapper[4867]: I1201 09:23:36.961480 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="438274bb-f607-4eef-af53-59566f7176d1" containerName="extract" Dec 01 09:23:36 crc kubenswrapper[4867]: I1201 09:23:36.962127 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" Dec 01 09:23:36 crc kubenswrapper[4867]: I1201 09:23:36.964868 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-nfkg2" Dec 01 09:23:37 crc kubenswrapper[4867]: I1201 09:23:37.018849 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc"] Dec 01 09:23:37 crc kubenswrapper[4867]: I1201 09:23:37.073848 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v89l\" (UniqueName: \"kubernetes.io/projected/8072c3c3-367c-47af-952b-f303a97d1afe-kube-api-access-7v89l\") pod \"openstack-operator-controller-operator-66fc949795-bpdpc\" (UID: \"8072c3c3-367c-47af-952b-f303a97d1afe\") " pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" Dec 01 09:23:37 crc kubenswrapper[4867]: I1201 09:23:37.175518 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v89l\" (UniqueName: \"kubernetes.io/projected/8072c3c3-367c-47af-952b-f303a97d1afe-kube-api-access-7v89l\") pod \"openstack-operator-controller-operator-66fc949795-bpdpc\" (UID: \"8072c3c3-367c-47af-952b-f303a97d1afe\") " pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" Dec 01 09:23:37 crc kubenswrapper[4867]: I1201 09:23:37.195598 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v89l\" (UniqueName: \"kubernetes.io/projected/8072c3c3-367c-47af-952b-f303a97d1afe-kube-api-access-7v89l\") pod \"openstack-operator-controller-operator-66fc949795-bpdpc\" (UID: \"8072c3c3-367c-47af-952b-f303a97d1afe\") " pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" Dec 01 09:23:37 crc kubenswrapper[4867]: I1201 09:23:37.279955 4867 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" Dec 01 09:23:37 crc kubenswrapper[4867]: I1201 09:23:37.606780 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc"] Dec 01 09:23:37 crc kubenswrapper[4867]: I1201 09:23:37.622368 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" event={"ID":"8072c3c3-367c-47af-952b-f303a97d1afe","Type":"ContainerStarted","Data":"ec0a6f91a06688cd6af3c2344353d13d1c0e661d965051ed86aa8ff12735bb75"} Dec 01 09:23:38 crc kubenswrapper[4867]: I1201 09:23:38.051332 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:38 crc kubenswrapper[4867]: I1201 09:23:38.052189 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:38 crc kubenswrapper[4867]: I1201 09:23:38.103413 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:38 crc kubenswrapper[4867]: I1201 09:23:38.679452 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:40 crc kubenswrapper[4867]: I1201 09:23:40.534243 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtm5w"] Dec 01 09:23:41 crc kubenswrapper[4867]: I1201 09:23:41.660412 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wtm5w" podUID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerName="registry-server" containerID="cri-o://374ff30e2323fd02a1ccd54458943cc9865941d167a57f860ab6861b6e103ac8" 
gracePeriod=2 Dec 01 09:23:42 crc kubenswrapper[4867]: I1201 09:23:42.673185 4867 generic.go:334] "Generic (PLEG): container finished" podID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerID="374ff30e2323fd02a1ccd54458943cc9865941d167a57f860ab6861b6e103ac8" exitCode=0 Dec 01 09:23:42 crc kubenswrapper[4867]: I1201 09:23:42.673383 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtm5w" event={"ID":"1ec1b9ad-4668-438b-9466-c3be6007b938","Type":"ContainerDied","Data":"374ff30e2323fd02a1ccd54458943cc9865941d167a57f860ab6861b6e103ac8"} Dec 01 09:23:47 crc kubenswrapper[4867]: I1201 09:23:47.979272 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.163420 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-catalog-content\") pod \"1ec1b9ad-4668-438b-9466-c3be6007b938\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.163515 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqrt6\" (UniqueName: \"kubernetes.io/projected/1ec1b9ad-4668-438b-9466-c3be6007b938-kube-api-access-gqrt6\") pod \"1ec1b9ad-4668-438b-9466-c3be6007b938\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.163601 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-utilities\") pod \"1ec1b9ad-4668-438b-9466-c3be6007b938\" (UID: \"1ec1b9ad-4668-438b-9466-c3be6007b938\") " Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.164424 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-utilities" (OuterVolumeSpecName: "utilities") pod "1ec1b9ad-4668-438b-9466-c3be6007b938" (UID: "1ec1b9ad-4668-438b-9466-c3be6007b938"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.169768 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec1b9ad-4668-438b-9466-c3be6007b938-kube-api-access-gqrt6" (OuterVolumeSpecName: "kube-api-access-gqrt6") pod "1ec1b9ad-4668-438b-9466-c3be6007b938" (UID: "1ec1b9ad-4668-438b-9466-c3be6007b938"). InnerVolumeSpecName "kube-api-access-gqrt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.188301 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ec1b9ad-4668-438b-9466-c3be6007b938" (UID: "1ec1b9ad-4668-438b-9466-c3be6007b938"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.265596 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.265636 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec1b9ad-4668-438b-9466-c3be6007b938-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.265651 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqrt6\" (UniqueName: \"kubernetes.io/projected/1ec1b9ad-4668-438b-9466-c3be6007b938-kube-api-access-gqrt6\") on node \"crc\" DevicePath \"\"" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.711350 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtm5w" event={"ID":"1ec1b9ad-4668-438b-9466-c3be6007b938","Type":"ContainerDied","Data":"855659b0108cef84eb559ca8b65f622c23101a19e8ea4e1abf10e007d263f7f8"} Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.711434 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtm5w" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.711771 4867 scope.go:117] "RemoveContainer" containerID="374ff30e2323fd02a1ccd54458943cc9865941d167a57f860ab6861b6e103ac8" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.790603 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtm5w"] Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.798526 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtm5w"] Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.821899 4867 scope.go:117] "RemoveContainer" containerID="49e449a68c4a79772eafde172c7d60d7829ab4481bf31c8145bf549c6f324649" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.835140 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec1b9ad-4668-438b-9466-c3be6007b938" path="/var/lib/kubelet/pods/1ec1b9ad-4668-438b-9466-c3be6007b938/volumes" Dec 01 09:23:48 crc kubenswrapper[4867]: I1201 09:23:48.846394 4867 scope.go:117] "RemoveContainer" containerID="7c3333519dd244ffb89051e89f2cf29a177946b11909bd00d12c5b6607ba8b60" Dec 01 09:23:49 crc kubenswrapper[4867]: I1201 09:23:49.719892 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" event={"ID":"8072c3c3-367c-47af-952b-f303a97d1afe","Type":"ContainerStarted","Data":"28e0f9bcc8f4a5bd4f330e871c2fd451400dba36726646e505cee71f420d4209"} Dec 01 09:23:49 crc kubenswrapper[4867]: I1201 09:23:49.719963 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" Dec 01 09:23:49 crc kubenswrapper[4867]: I1201 09:23:49.764141 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" 
podStartSLOduration=2.349072553 podStartE2EDuration="13.764117443s" podCreationTimestamp="2025-12-01 09:23:36 +0000 UTC" firstStartedPulling="2025-12-01 09:23:37.618599433 +0000 UTC m=+939.077986187" lastFinishedPulling="2025-12-01 09:23:49.033644323 +0000 UTC m=+950.493031077" observedRunningTime="2025-12-01 09:23:49.757336678 +0000 UTC m=+951.216723432" watchObservedRunningTime="2025-12-01 09:23:49.764117443 +0000 UTC m=+951.223504197" Dec 01 09:23:51 crc kubenswrapper[4867]: I1201 09:23:51.601361 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:23:51 crc kubenswrapper[4867]: I1201 09:23:51.601453 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:23:51 crc kubenswrapper[4867]: I1201 09:23:51.601528 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:23:51 crc kubenswrapper[4867]: I1201 09:23:51.602502 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"793e6afb196d113ee707de55107187443477607da81a9336611cc7c60ae9f91b"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:23:51 crc kubenswrapper[4867]: I1201 09:23:51.602614 4867 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://793e6afb196d113ee707de55107187443477607da81a9336611cc7c60ae9f91b" gracePeriod=600 Dec 01 09:23:51 crc kubenswrapper[4867]: I1201 09:23:51.734760 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="793e6afb196d113ee707de55107187443477607da81a9336611cc7c60ae9f91b" exitCode=0 Dec 01 09:23:51 crc kubenswrapper[4867]: I1201 09:23:51.734805 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"793e6afb196d113ee707de55107187443477607da81a9336611cc7c60ae9f91b"} Dec 01 09:23:51 crc kubenswrapper[4867]: I1201 09:23:51.734860 4867 scope.go:117] "RemoveContainer" containerID="fa8ac94dfac3773a1b35360216b60b2041c8fba117bde3b6f4dcf7bb4fc033b2" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.647480 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2gj8n"] Dec 01 09:23:52 crc kubenswrapper[4867]: E1201 09:23:52.648278 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerName="extract-utilities" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.648294 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerName="extract-utilities" Dec 01 09:23:52 crc kubenswrapper[4867]: E1201 09:23:52.648309 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerName="registry-server" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.648316 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerName="registry-server" Dec 01 
09:23:52 crc kubenswrapper[4867]: E1201 09:23:52.648329 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerName="extract-content" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.648338 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerName="extract-content" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.648454 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec1b9ad-4668-438b-9466-c3be6007b938" containerName="registry-server" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.649478 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.657646 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2gj8n"] Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.743248 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"3efb00c27c0eaaf97b5cf3c44be1e5a5598923c3a199804003a6c5848c9f9cea"} Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.830741 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-catalog-content\") pod \"certified-operators-2gj8n\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.830873 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-utilities\") pod 
\"certified-operators-2gj8n\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.830902 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrfdq\" (UniqueName: \"kubernetes.io/projected/9d045030-b6de-435a-9220-69aeadef1746-kube-api-access-zrfdq\") pod \"certified-operators-2gj8n\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.931916 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-catalog-content\") pod \"certified-operators-2gj8n\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.932017 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-utilities\") pod \"certified-operators-2gj8n\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.932053 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrfdq\" (UniqueName: \"kubernetes.io/projected/9d045030-b6de-435a-9220-69aeadef1746-kube-api-access-zrfdq\") pod \"certified-operators-2gj8n\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.932859 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-catalog-content\") pod \"certified-operators-2gj8n\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.933120 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-utilities\") pod \"certified-operators-2gj8n\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.959530 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrfdq\" (UniqueName: \"kubernetes.io/projected/9d045030-b6de-435a-9220-69aeadef1746-kube-api-access-zrfdq\") pod \"certified-operators-2gj8n\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:52 crc kubenswrapper[4867]: I1201 09:23:52.966194 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:23:53 crc kubenswrapper[4867]: I1201 09:23:53.273481 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2gj8n"] Dec 01 09:23:53 crc kubenswrapper[4867]: I1201 09:23:53.749988 4867 generic.go:334] "Generic (PLEG): container finished" podID="9d045030-b6de-435a-9220-69aeadef1746" containerID="dd958687e3fdcd98d0b880f4b88c55037754301eb590dfcc845d15fc28c31c96" exitCode=0 Dec 01 09:23:53 crc kubenswrapper[4867]: I1201 09:23:53.750038 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gj8n" event={"ID":"9d045030-b6de-435a-9220-69aeadef1746","Type":"ContainerDied","Data":"dd958687e3fdcd98d0b880f4b88c55037754301eb590dfcc845d15fc28c31c96"} Dec 01 09:23:53 crc kubenswrapper[4867]: I1201 09:23:53.750291 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gj8n" event={"ID":"9d045030-b6de-435a-9220-69aeadef1746","Type":"ContainerStarted","Data":"dea4cca215b2fa74df92bea5ccdefe3c8d3d5cd5556547c94650ce85550c3644"} Dec 01 09:23:54 crc kubenswrapper[4867]: I1201 09:23:54.766091 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gj8n" event={"ID":"9d045030-b6de-435a-9220-69aeadef1746","Type":"ContainerStarted","Data":"9b547a359d6c370d7f4b0e50533053d4128cb8e2fca29a4b91dc66f4e835c35b"} Dec 01 09:23:55 crc kubenswrapper[4867]: I1201 09:23:55.777090 4867 generic.go:334] "Generic (PLEG): container finished" podID="9d045030-b6de-435a-9220-69aeadef1746" containerID="9b547a359d6c370d7f4b0e50533053d4128cb8e2fca29a4b91dc66f4e835c35b" exitCode=0 Dec 01 09:23:55 crc kubenswrapper[4867]: I1201 09:23:55.777232 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gj8n" 
event={"ID":"9d045030-b6de-435a-9220-69aeadef1746","Type":"ContainerDied","Data":"9b547a359d6c370d7f4b0e50533053d4128cb8e2fca29a4b91dc66f4e835c35b"} Dec 01 09:23:57 crc kubenswrapper[4867]: I1201 09:23:57.282290 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-66fc949795-bpdpc" Dec 01 09:23:57 crc kubenswrapper[4867]: I1201 09:23:57.798215 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gj8n" event={"ID":"9d045030-b6de-435a-9220-69aeadef1746","Type":"ContainerStarted","Data":"f2ee8bd04d6cfc51e8b55f2595697dc71c3cb4e8ba91e4b8ad5ce2ef3f3753cd"} Dec 01 09:23:57 crc kubenswrapper[4867]: I1201 09:23:57.825385 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2gj8n" podStartSLOduration=2.829831753 podStartE2EDuration="5.825367443s" podCreationTimestamp="2025-12-01 09:23:52 +0000 UTC" firstStartedPulling="2025-12-01 09:23:53.751066741 +0000 UTC m=+955.210453495" lastFinishedPulling="2025-12-01 09:23:56.746602431 +0000 UTC m=+958.205989185" observedRunningTime="2025-12-01 09:23:57.819954995 +0000 UTC m=+959.279341769" watchObservedRunningTime="2025-12-01 09:23:57.825367443 +0000 UTC m=+959.284754197" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.052568 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-22kcv"] Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.054040 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.073306 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-22kcv"] Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.127488 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7pvv\" (UniqueName: \"kubernetes.io/projected/b12688cd-6bde-4fda-9453-18e0238eb201-kube-api-access-c7pvv\") pod \"community-operators-22kcv\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.127564 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-catalog-content\") pod \"community-operators-22kcv\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.127607 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-utilities\") pod \"community-operators-22kcv\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.408083 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7pvv\" (UniqueName: \"kubernetes.io/projected/b12688cd-6bde-4fda-9453-18e0238eb201-kube-api-access-c7pvv\") pod \"community-operators-22kcv\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.408421 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-catalog-content\") pod \"community-operators-22kcv\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.408468 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-utilities\") pod \"community-operators-22kcv\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.409091 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-utilities\") pod \"community-operators-22kcv\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.409190 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-catalog-content\") pod \"community-operators-22kcv\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.462918 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7pvv\" (UniqueName: \"kubernetes.io/projected/b12688cd-6bde-4fda-9453-18e0238eb201-kube-api-access-c7pvv\") pod \"community-operators-22kcv\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:58 crc kubenswrapper[4867]: I1201 09:23:58.671789 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:23:59 crc kubenswrapper[4867]: I1201 09:23:59.144995 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-22kcv"] Dec 01 09:23:59 crc kubenswrapper[4867]: W1201 09:23:59.160336 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb12688cd_6bde_4fda_9453_18e0238eb201.slice/crio-2ead05c078b5017297f5bb1335e059d257ea668fd4f1cdfdef4f62d5750abb7a WatchSource:0}: Error finding container 2ead05c078b5017297f5bb1335e059d257ea668fd4f1cdfdef4f62d5750abb7a: Status 404 returned error can't find the container with id 2ead05c078b5017297f5bb1335e059d257ea668fd4f1cdfdef4f62d5750abb7a Dec 01 09:23:59 crc kubenswrapper[4867]: I1201 09:23:59.812063 4867 generic.go:334] "Generic (PLEG): container finished" podID="b12688cd-6bde-4fda-9453-18e0238eb201" containerID="da0e86917115345a2f7d9b10b9698b90e812ce5accbfdba745e394f012598112" exitCode=0 Dec 01 09:23:59 crc kubenswrapper[4867]: I1201 09:23:59.812146 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22kcv" event={"ID":"b12688cd-6bde-4fda-9453-18e0238eb201","Type":"ContainerDied","Data":"da0e86917115345a2f7d9b10b9698b90e812ce5accbfdba745e394f012598112"} Dec 01 09:23:59 crc kubenswrapper[4867]: I1201 09:23:59.812177 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22kcv" event={"ID":"b12688cd-6bde-4fda-9453-18e0238eb201","Type":"ContainerStarted","Data":"2ead05c078b5017297f5bb1335e059d257ea668fd4f1cdfdef4f62d5750abb7a"} Dec 01 09:24:02 crc kubenswrapper[4867]: I1201 09:24:02.967093 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:24:02 crc kubenswrapper[4867]: I1201 09:24:02.967776 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:24:03 crc kubenswrapper[4867]: I1201 09:24:03.090060 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:24:04 crc kubenswrapper[4867]: I1201 09:24:04.009198 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:24:05 crc kubenswrapper[4867]: I1201 09:24:05.242944 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2gj8n"] Dec 01 09:24:05 crc kubenswrapper[4867]: I1201 09:24:05.856585 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2gj8n" podUID="9d045030-b6de-435a-9220-69aeadef1746" containerName="registry-server" containerID="cri-o://f2ee8bd04d6cfc51e8b55f2595697dc71c3cb4e8ba91e4b8ad5ce2ef3f3753cd" gracePeriod=2 Dec 01 09:24:07 crc kubenswrapper[4867]: I1201 09:24:07.873552 4867 generic.go:334] "Generic (PLEG): container finished" podID="9d045030-b6de-435a-9220-69aeadef1746" containerID="f2ee8bd04d6cfc51e8b55f2595697dc71c3cb4e8ba91e4b8ad5ce2ef3f3753cd" exitCode=0 Dec 01 09:24:07 crc kubenswrapper[4867]: I1201 09:24:07.873624 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gj8n" event={"ID":"9d045030-b6de-435a-9220-69aeadef1746","Type":"ContainerDied","Data":"f2ee8bd04d6cfc51e8b55f2595697dc71c3cb4e8ba91e4b8ad5ce2ef3f3753cd"} Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.485512 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.669059 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrfdq\" (UniqueName: \"kubernetes.io/projected/9d045030-b6de-435a-9220-69aeadef1746-kube-api-access-zrfdq\") pod \"9d045030-b6de-435a-9220-69aeadef1746\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.669464 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-catalog-content\") pod \"9d045030-b6de-435a-9220-69aeadef1746\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.669585 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-utilities\") pod \"9d045030-b6de-435a-9220-69aeadef1746\" (UID: \"9d045030-b6de-435a-9220-69aeadef1746\") " Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.670323 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-utilities" (OuterVolumeSpecName: "utilities") pod "9d045030-b6de-435a-9220-69aeadef1746" (UID: "9d045030-b6de-435a-9220-69aeadef1746"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.675004 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d045030-b6de-435a-9220-69aeadef1746-kube-api-access-zrfdq" (OuterVolumeSpecName: "kube-api-access-zrfdq") pod "9d045030-b6de-435a-9220-69aeadef1746" (UID: "9d045030-b6de-435a-9220-69aeadef1746"). InnerVolumeSpecName "kube-api-access-zrfdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.715387 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d045030-b6de-435a-9220-69aeadef1746" (UID: "9d045030-b6de-435a-9220-69aeadef1746"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.770905 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.770938 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrfdq\" (UniqueName: \"kubernetes.io/projected/9d045030-b6de-435a-9220-69aeadef1746-kube-api-access-zrfdq\") on node \"crc\" DevicePath \"\"" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.770949 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d045030-b6de-435a-9220-69aeadef1746-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.880153 4867 generic.go:334] "Generic (PLEG): container finished" podID="b12688cd-6bde-4fda-9453-18e0238eb201" containerID="7a02d063492adc4f9abecaf9587509ccc9f296ea25d38d8c608666d5f96f8de3" exitCode=0 Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.880228 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22kcv" event={"ID":"b12688cd-6bde-4fda-9453-18e0238eb201","Type":"ContainerDied","Data":"7a02d063492adc4f9abecaf9587509ccc9f296ea25d38d8c608666d5f96f8de3"} Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.881988 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-2gj8n" event={"ID":"9d045030-b6de-435a-9220-69aeadef1746","Type":"ContainerDied","Data":"dea4cca215b2fa74df92bea5ccdefe3c8d3d5cd5556547c94650ce85550c3644"} Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.882023 4867 scope.go:117] "RemoveContainer" containerID="f2ee8bd04d6cfc51e8b55f2595697dc71c3cb4e8ba91e4b8ad5ce2ef3f3753cd" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.882221 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2gj8n" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.896255 4867 scope.go:117] "RemoveContainer" containerID="9b547a359d6c370d7f4b0e50533053d4128cb8e2fca29a4b91dc66f4e835c35b" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.913661 4867 scope.go:117] "RemoveContainer" containerID="dd958687e3fdcd98d0b880f4b88c55037754301eb590dfcc845d15fc28c31c96" Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.959668 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2gj8n"] Dec 01 09:24:08 crc kubenswrapper[4867]: I1201 09:24:08.971869 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2gj8n"] Dec 01 09:24:09 crc kubenswrapper[4867]: I1201 09:24:09.890911 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22kcv" event={"ID":"b12688cd-6bde-4fda-9453-18e0238eb201","Type":"ContainerStarted","Data":"11e142b10d578f605c76d943d8d44b066db7f09a2e3b2aa8356a57357db80506"} Dec 01 09:24:10 crc kubenswrapper[4867]: I1201 09:24:10.835651 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d045030-b6de-435a-9220-69aeadef1746" path="/var/lib/kubelet/pods/9d045030-b6de-435a-9220-69aeadef1746/volumes" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.277031 4867 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/community-operators-22kcv" podStartSLOduration=8.740152236 podStartE2EDuration="18.277016028s" podCreationTimestamp="2025-12-01 09:23:58 +0000 UTC" firstStartedPulling="2025-12-01 09:23:59.813929877 +0000 UTC m=+961.273316631" lastFinishedPulling="2025-12-01 09:24:09.350793669 +0000 UTC m=+970.810180423" observedRunningTime="2025-12-01 09:24:09.914272455 +0000 UTC m=+971.373659199" watchObservedRunningTime="2025-12-01 09:24:16.277016028 +0000 UTC m=+977.736402782" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.279295 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56"] Dec 01 09:24:16 crc kubenswrapper[4867]: E1201 09:24:16.279529 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d045030-b6de-435a-9220-69aeadef1746" containerName="extract-utilities" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.279545 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d045030-b6de-435a-9220-69aeadef1746" containerName="extract-utilities" Dec 01 09:24:16 crc kubenswrapper[4867]: E1201 09:24:16.279556 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d045030-b6de-435a-9220-69aeadef1746" containerName="registry-server" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.279564 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d045030-b6de-435a-9220-69aeadef1746" containerName="registry-server" Dec 01 09:24:16 crc kubenswrapper[4867]: E1201 09:24:16.279577 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d045030-b6de-435a-9220-69aeadef1746" containerName="extract-content" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.279583 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d045030-b6de-435a-9220-69aeadef1746" containerName="extract-content" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.279683 4867 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9d045030-b6de-435a-9220-69aeadef1746" containerName="registry-server" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.280280 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.284094 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-s57k6" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.292328 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.293392 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.295701 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-j68cn" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.328040 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.335583 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.336536 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.344359 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-wtdn6" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.368893 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.369625 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pppts\" (UniqueName: \"kubernetes.io/projected/0e850850-d946-42aa-a035-1bf8dcba402f-kube-api-access-pppts\") pod \"barbican-operator-controller-manager-7d9dfd778-nrm56\" (UID: \"0e850850-d946-42aa-a035-1bf8dcba402f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.369717 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grktq\" (UniqueName: \"kubernetes.io/projected/0deeeac8-147f-441c-ba67-2e6e9bc32073-kube-api-access-grktq\") pod \"designate-operator-controller-manager-78b4bc895b-9nc4v\" (UID: \"0deeeac8-147f-441c-ba67-2e6e9bc32073\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.369751 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc7kg\" (UniqueName: \"kubernetes.io/projected/c10410e7-47b2-4a48-bf7d-440a00afd4b4-kube-api-access-rc7kg\") pod \"cinder-operator-controller-manager-859b6ccc6-4wbsd\" (UID: \"c10410e7-47b2-4a48-bf7d-440a00afd4b4\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 
09:24:16.385874 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.387050 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.392254 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-wpf5m" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.400465 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.420870 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.457878 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.458895 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.465291 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nl98z" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.470008 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.470968 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.471852 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pppts\" (UniqueName: \"kubernetes.io/projected/0e850850-d946-42aa-a035-1bf8dcba402f-kube-api-access-pppts\") pod \"barbican-operator-controller-manager-7d9dfd778-nrm56\" (UID: \"0e850850-d946-42aa-a035-1bf8dcba402f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.471897 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh46m\" (UniqueName: \"kubernetes.io/projected/68e139fd-19f5-4033-93b8-4ebf8397b510-kube-api-access-sh46m\") pod \"glance-operator-controller-manager-668d9c48b9-vktv2\" (UID: \"68e139fd-19f5-4033-93b8-4ebf8397b510\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.471968 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grktq\" (UniqueName: \"kubernetes.io/projected/0deeeac8-147f-441c-ba67-2e6e9bc32073-kube-api-access-grktq\") pod \"designate-operator-controller-manager-78b4bc895b-9nc4v\" (UID: \"0deeeac8-147f-441c-ba67-2e6e9bc32073\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.471995 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc7kg\" (UniqueName: \"kubernetes.io/projected/c10410e7-47b2-4a48-bf7d-440a00afd4b4-kube-api-access-rc7kg\") pod \"cinder-operator-controller-manager-859b6ccc6-4wbsd\" (UID: \"c10410e7-47b2-4a48-bf7d-440a00afd4b4\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" Dec 01 09:24:16 crc 
kubenswrapper[4867]: I1201 09:24:16.477461 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-996t8" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.483648 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.530063 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grktq\" (UniqueName: \"kubernetes.io/projected/0deeeac8-147f-441c-ba67-2e6e9bc32073-kube-api-access-grktq\") pod \"designate-operator-controller-manager-78b4bc895b-9nc4v\" (UID: \"0deeeac8-147f-441c-ba67-2e6e9bc32073\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.534411 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pppts\" (UniqueName: \"kubernetes.io/projected/0e850850-d946-42aa-a035-1bf8dcba402f-kube-api-access-pppts\") pod \"barbican-operator-controller-manager-7d9dfd778-nrm56\" (UID: \"0e850850-d946-42aa-a035-1bf8dcba402f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.536536 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc7kg\" (UniqueName: \"kubernetes.io/projected/c10410e7-47b2-4a48-bf7d-440a00afd4b4-kube-api-access-rc7kg\") pod \"cinder-operator-controller-manager-859b6ccc6-4wbsd\" (UID: \"c10410e7-47b2-4a48-bf7d-440a00afd4b4\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.547661 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-24whr"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.549907 
4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.550024 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.560207 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bgttr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.560342 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.572932 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh46m\" (UniqueName: \"kubernetes.io/projected/68e139fd-19f5-4033-93b8-4ebf8397b510-kube-api-access-sh46m\") pod \"glance-operator-controller-manager-668d9c48b9-vktv2\" (UID: \"68e139fd-19f5-4033-93b8-4ebf8397b510\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.573044 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6vsb\" (UniqueName: \"kubernetes.io/projected/468cf199-ea48-4a5a-ac34-057670369f66-kube-api-access-t6vsb\") pod \"horizon-operator-controller-manager-68c6d99b8f-p7rms\" (UID: \"468cf199-ea48-4a5a-ac34-057670369f66\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.573076 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: 
\"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.573110 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzjtw\" (UniqueName: \"kubernetes.io/projected/0d369519-2f02-4efe-9deb-885362964597-kube-api-access-wzjtw\") pod \"heat-operator-controller-manager-5f64f6f8bb-zdllr\" (UID: \"0d369519-2f02-4efe-9deb-885362964597\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.573170 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72q8p\" (UniqueName: \"kubernetes.io/projected/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-kube-api-access-72q8p\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.597143 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.604417 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.606805 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.617576 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.617022 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.625893 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-86vqz" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.628071 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-24whr"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.652838 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh46m\" (UniqueName: \"kubernetes.io/projected/68e139fd-19f5-4033-93b8-4ebf8397b510-kube-api-access-sh46m\") pod \"glance-operator-controller-manager-668d9c48b9-vktv2\" (UID: \"68e139fd-19f5-4033-93b8-4ebf8397b510\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.653353 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.682079 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6vsb\" (UniqueName: \"kubernetes.io/projected/468cf199-ea48-4a5a-ac34-057670369f66-kube-api-access-t6vsb\") pod \"horizon-operator-controller-manager-68c6d99b8f-p7rms\" (UID: \"468cf199-ea48-4a5a-ac34-057670369f66\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.682150 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.682202 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzjtw\" (UniqueName: \"kubernetes.io/projected/0d369519-2f02-4efe-9deb-885362964597-kube-api-access-wzjtw\") pod \"heat-operator-controller-manager-5f64f6f8bb-zdllr\" (UID: \"0d369519-2f02-4efe-9deb-885362964597\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.682257 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6cxx\" (UniqueName: \"kubernetes.io/projected/fd8d1846-f143-4ca0-88df-af3eca96175d-kube-api-access-v6cxx\") pod \"ironic-operator-controller-manager-6c548fd776-b4j75\" (UID: \"fd8d1846-f143-4ca0-88df-af3eca96175d\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.682310 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72q8p\" (UniqueName: \"kubernetes.io/projected/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-kube-api-access-72q8p\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:16 crc kubenswrapper[4867]: E1201 09:24:16.683007 4867 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:16 crc kubenswrapper[4867]: E1201 09:24:16.683061 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert podName:b5f9e64b-a7d0-4437-91ac-f84c2441cd8d nodeName:}" failed. No retries permitted until 2025-12-01 09:24:17.18303902 +0000 UTC m=+978.642425774 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert") pod "infra-operator-controller-manager-57548d458d-24whr" (UID: "b5f9e64b-a7d0-4437-91ac-f84c2441cd8d") : secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.683461 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.697445 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.706684 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mnvlr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.716302 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzjtw\" (UniqueName: \"kubernetes.io/projected/0d369519-2f02-4efe-9deb-885362964597-kube-api-access-wzjtw\") pod \"heat-operator-controller-manager-5f64f6f8bb-zdllr\" (UID: \"0d369519-2f02-4efe-9deb-885362964597\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.716365 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.726508 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.744604 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72q8p\" (UniqueName: \"kubernetes.io/projected/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-kube-api-access-72q8p\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.747935 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6vsb\" (UniqueName: \"kubernetes.io/projected/468cf199-ea48-4a5a-ac34-057670369f66-kube-api-access-t6vsb\") pod \"horizon-operator-controller-manager-68c6d99b8f-p7rms\" (UID: \"468cf199-ea48-4a5a-ac34-057670369f66\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.767130 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd"] Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.771804 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.785760 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6cxx\" (UniqueName: \"kubernetes.io/projected/fd8d1846-f143-4ca0-88df-af3eca96175d-kube-api-access-v6cxx\") pod \"ironic-operator-controller-manager-6c548fd776-b4j75\" (UID: \"fd8d1846-f143-4ca0-88df-af3eca96175d\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.785853 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdpsl\" (UniqueName: \"kubernetes.io/projected/07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a-kube-api-access-cdpsl\") pod \"keystone-operator-controller-manager-546d4bdf48-492tf\" (UID: \"07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.785892 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n79cg\" (UniqueName: \"kubernetes.io/projected/a8956c5b-7421-4442-8d62-773a5fe02fd0-kube-api-access-n79cg\") pod \"manila-operator-controller-manager-6546668bfd-hlksd\" (UID: \"a8956c5b-7421-4442-8d62-773a5fe02fd0\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.786651 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-gtzbh" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.790398 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.803508 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.835657 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6cxx\" (UniqueName: \"kubernetes.io/projected/fd8d1846-f143-4ca0-88df-af3eca96175d-kube-api-access-v6cxx\") pod \"ironic-operator-controller-manager-6c548fd776-b4j75\" (UID: \"fd8d1846-f143-4ca0-88df-af3eca96175d\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.890586 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdpsl\" (UniqueName: \"kubernetes.io/projected/07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a-kube-api-access-cdpsl\") pod \"keystone-operator-controller-manager-546d4bdf48-492tf\" (UID: \"07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.890654 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n79cg\" (UniqueName: \"kubernetes.io/projected/a8956c5b-7421-4442-8d62-773a5fe02fd0-kube-api-access-n79cg\") pod \"manila-operator-controller-manager-6546668bfd-hlksd\" (UID: \"a8956c5b-7421-4442-8d62-773a5fe02fd0\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.945710 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdpsl\" (UniqueName: \"kubernetes.io/projected/07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a-kube-api-access-cdpsl\") pod 
\"keystone-operator-controller-manager-546d4bdf48-492tf\" (UID: \"07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" Dec 01 09:24:16 crc kubenswrapper[4867]: I1201 09:24:16.955858 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n79cg\" (UniqueName: \"kubernetes.io/projected/a8956c5b-7421-4442-8d62-773a5fe02fd0-kube-api-access-n79cg\") pod \"manila-operator-controller-manager-6546668bfd-hlksd\" (UID: \"a8956c5b-7421-4442-8d62-773a5fe02fd0\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:16.999533 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:16.999567 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.003782 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.003863 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.004333 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.005690 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.010686 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.011307 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jkzjl" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.027216 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.030038 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-x8ml2" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.051104 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.057290 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.058613 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.069888 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.077563 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bd6vz" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.095677 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn6x5\" (UniqueName: \"kubernetes.io/projected/cb8d2624-ad08-41e7-bb2a-48bc75a2dd62-kube-api-access-hn6x5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-77sbx\" (UID: \"cb8d2624-ad08-41e7-bb2a-48bc75a2dd62\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.095740 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vs75\" (UniqueName: \"kubernetes.io/projected/30c79a23-86f2-4a05-adde-41ada03e2e7e-kube-api-access-6vs75\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-twg2p\" (UID: \"30c79a23-86f2-4a05-adde-41ada03e2e7e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.126462 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.127478 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.136890 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-k4f2p" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.144207 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.151309 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.152447 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.153675 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.156917 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gd7c8" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.169361 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.198240 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn6x5\" (UniqueName: \"kubernetes.io/projected/cb8d2624-ad08-41e7-bb2a-48bc75a2dd62-kube-api-access-hn6x5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-77sbx\" (UID: \"cb8d2624-ad08-41e7-bb2a-48bc75a2dd62\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.198303 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf772\" (UniqueName: \"kubernetes.io/projected/656a9362-30cf-43f6-9909-95859bef129e-kube-api-access-nf772\") pod \"nova-operator-controller-manager-697bc559fc-sjmfh\" (UID: \"656a9362-30cf-43f6-9909-95859bef129e\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.198341 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vs75\" (UniqueName: \"kubernetes.io/projected/30c79a23-86f2-4a05-adde-41ada03e2e7e-kube-api-access-6vs75\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-twg2p\" (UID: \"30c79a23-86f2-4a05-adde-41ada03e2e7e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.198386 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:17 crc kubenswrapper[4867]: E1201 09:24:17.198554 4867 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:17 crc kubenswrapper[4867]: E1201 09:24:17.198609 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert podName:b5f9e64b-a7d0-4437-91ac-f84c2441cd8d nodeName:}" failed. No retries permitted until 2025-12-01 09:24:18.198589581 +0000 UTC m=+979.657976335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert") pod "infra-operator-controller-manager-57548d458d-24whr" (UID: "b5f9e64b-a7d0-4437-91ac-f84c2441cd8d") : secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.216063 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.232121 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.236541 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-7zxn7" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.238671 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vs75\" (UniqueName: \"kubernetes.io/projected/30c79a23-86f2-4a05-adde-41ada03e2e7e-kube-api-access-6vs75\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-twg2p\" (UID: \"30c79a23-86f2-4a05-adde-41ada03e2e7e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.244389 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn6x5\" (UniqueName: \"kubernetes.io/projected/cb8d2624-ad08-41e7-bb2a-48bc75a2dd62-kube-api-access-hn6x5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-77sbx\" (UID: \"cb8d2624-ad08-41e7-bb2a-48bc75a2dd62\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.276898 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.301859 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf772\" (UniqueName: \"kubernetes.io/projected/656a9362-30cf-43f6-9909-95859bef129e-kube-api-access-nf772\") pod \"nova-operator-controller-manager-697bc559fc-sjmfh\" (UID: \"656a9362-30cf-43f6-9909-95859bef129e\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.301913 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.301942 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5k7\" (UniqueName: \"kubernetes.io/projected/4d73996e-90d0-44f5-85f9-3800f54fc3d7-kube-api-access-bs5k7\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.301969 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7m7h\" (UniqueName: \"kubernetes.io/projected/e9fd074d-b9bc-4215-bfd7-56df604f101c-kube-api-access-g7m7h\") pod \"octavia-operator-controller-manager-998648c74-mxkvs\" (UID: \"e9fd074d-b9bc-4215-bfd7-56df604f101c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.319474 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.320498 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.321779 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf772\" (UniqueName: \"kubernetes.io/projected/656a9362-30cf-43f6-9909-95859bef129e-kube-api-access-nf772\") pod \"nova-operator-controller-manager-697bc559fc-sjmfh\" (UID: \"656a9362-30cf-43f6-9909-95859bef129e\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.324852 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wxsff" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.330608 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.332492 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.355751 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-n7dd8" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.391414 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.393592 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.398432 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.405688 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-77xrc" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.405924 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.405970 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5k7\" (UniqueName: \"kubernetes.io/projected/4d73996e-90d0-44f5-85f9-3800f54fc3d7-kube-api-access-bs5k7\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.406002 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7m7h\" (UniqueName: \"kubernetes.io/projected/e9fd074d-b9bc-4215-bfd7-56df604f101c-kube-api-access-g7m7h\") pod \"octavia-operator-controller-manager-998648c74-mxkvs\" (UID: \"e9fd074d-b9bc-4215-bfd7-56df604f101c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.406082 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2wxg\" (UniqueName: \"kubernetes.io/projected/ffbd9e52-147b-42cd-abaa-ec7d1341b826-kube-api-access-l2wxg\") pod 
\"placement-operator-controller-manager-78f8948974-bhgk8\" (UID: \"ffbd9e52-147b-42cd-abaa-ec7d1341b826\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.406121 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8wh4\" (UniqueName: \"kubernetes.io/projected/461764b0-73a3-4866-aec1-e687293591e3-kube-api-access-r8wh4\") pod \"ovn-operator-controller-manager-b6456fdb6-grrzp\" (UID: \"461764b0-73a3-4866-aec1-e687293591e3\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" Dec 01 09:24:17 crc kubenswrapper[4867]: E1201 09:24:17.406256 4867 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:17 crc kubenswrapper[4867]: E1201 09:24:17.406574 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert podName:4d73996e-90d0-44f5-85f9-3800f54fc3d7 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:17.906560361 +0000 UTC m=+979.365947115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" (UID: "4d73996e-90d0-44f5-85f9-3800f54fc3d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.415500 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.437089 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.441137 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.447150 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.448884 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7m7h\" (UniqueName: \"kubernetes.io/projected/e9fd074d-b9bc-4215-bfd7-56df604f101c-kube-api-access-g7m7h\") pod \"octavia-operator-controller-manager-998648c74-mxkvs\" (UID: \"e9fd074d-b9bc-4215-bfd7-56df604f101c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.456378 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5k7\" (UniqueName: \"kubernetes.io/projected/4d73996e-90d0-44f5-85f9-3800f54fc3d7-kube-api-access-bs5k7\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.462095 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.463324 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.470504 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.471697 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.475171 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5gtcl" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.475803 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nkz7f" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.495531 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.500526 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.508510 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2wxg\" (UniqueName: \"kubernetes.io/projected/ffbd9e52-147b-42cd-abaa-ec7d1341b826-kube-api-access-l2wxg\") pod \"placement-operator-controller-manager-78f8948974-bhgk8\" (UID: \"ffbd9e52-147b-42cd-abaa-ec7d1341b826\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.508566 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c6lq\" (UniqueName: \"kubernetes.io/projected/f3176675-0a3a-4fd2-9727-349ec1b88de7-kube-api-access-9c6lq\") pod \"swift-operator-controller-manager-5f8c65bbfc-j698r\" (UID: \"f3176675-0a3a-4fd2-9727-349ec1b88de7\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.508590 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8wh4\" (UniqueName: \"kubernetes.io/projected/461764b0-73a3-4866-aec1-e687293591e3-kube-api-access-r8wh4\") pod \"ovn-operator-controller-manager-b6456fdb6-grrzp\" (UID: \"461764b0-73a3-4866-aec1-e687293591e3\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.508625 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk2dw\" (UniqueName: \"kubernetes.io/projected/9be92a6c-afef-449e-927b-8d0732a2140a-kube-api-access-dk2dw\") pod \"telemetry-operator-controller-manager-76cc84c6bb-l7jwc\" (UID: \"9be92a6c-afef-449e-927b-8d0732a2140a\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" Dec 01 09:24:17 crc 
kubenswrapper[4867]: I1201 09:24:17.512110 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.611550 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c6lq\" (UniqueName: \"kubernetes.io/projected/f3176675-0a3a-4fd2-9727-349ec1b88de7-kube-api-access-9c6lq\") pod \"swift-operator-controller-manager-5f8c65bbfc-j698r\" (UID: \"f3176675-0a3a-4fd2-9727-349ec1b88de7\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.612121 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk2dw\" (UniqueName: \"kubernetes.io/projected/9be92a6c-afef-449e-927b-8d0732a2140a-kube-api-access-dk2dw\") pod \"telemetry-operator-controller-manager-76cc84c6bb-l7jwc\" (UID: \"9be92a6c-afef-449e-927b-8d0732a2140a\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.612198 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z42d\" (UniqueName: \"kubernetes.io/projected/573accdf-9cb1-4643-af86-744e695a1f9d-kube-api-access-5z42d\") pod \"test-operator-controller-manager-5854674fcc-g2ddn\" (UID: \"573accdf-9cb1-4643-af86-744e695a1f9d\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.612228 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84k79\" (UniqueName: \"kubernetes.io/projected/c900776b-c7ea-4e4d-9b6b-00245cf048ce-kube-api-access-84k79\") pod \"watcher-operator-controller-manager-769dc69bc-swrc5\" (UID: \"c900776b-c7ea-4e4d-9b6b-00245cf048ce\") " 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.620077 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.675779 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk2dw\" (UniqueName: \"kubernetes.io/projected/9be92a6c-afef-449e-927b-8d0732a2140a-kube-api-access-dk2dw\") pod \"telemetry-operator-controller-manager-76cc84c6bb-l7jwc\" (UID: \"9be92a6c-afef-449e-927b-8d0732a2140a\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.676286 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2wxg\" (UniqueName: \"kubernetes.io/projected/ffbd9e52-147b-42cd-abaa-ec7d1341b826-kube-api-access-l2wxg\") pod \"placement-operator-controller-manager-78f8948974-bhgk8\" (UID: \"ffbd9e52-147b-42cd-abaa-ec7d1341b826\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.688700 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.693599 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8wh4\" (UniqueName: \"kubernetes.io/projected/461764b0-73a3-4866-aec1-e687293591e3-kube-api-access-r8wh4\") pod \"ovn-operator-controller-manager-b6456fdb6-grrzp\" (UID: \"461764b0-73a3-4866-aec1-e687293591e3\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.694072 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c6lq\" 
(UniqueName: \"kubernetes.io/projected/f3176675-0a3a-4fd2-9727-349ec1b88de7-kube-api-access-9c6lq\") pod \"swift-operator-controller-manager-5f8c65bbfc-j698r\" (UID: \"f3176675-0a3a-4fd2-9727-349ec1b88de7\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.713957 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z42d\" (UniqueName: \"kubernetes.io/projected/573accdf-9cb1-4643-af86-744e695a1f9d-kube-api-access-5z42d\") pod \"test-operator-controller-manager-5854674fcc-g2ddn\" (UID: \"573accdf-9cb1-4643-af86-744e695a1f9d\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.714307 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84k79\" (UniqueName: \"kubernetes.io/projected/c900776b-c7ea-4e4d-9b6b-00245cf048ce-kube-api-access-84k79\") pod \"watcher-operator-controller-manager-769dc69bc-swrc5\" (UID: \"c900776b-c7ea-4e4d-9b6b-00245cf048ce\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.743296 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.760653 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z42d\" (UniqueName: \"kubernetes.io/projected/573accdf-9cb1-4643-af86-744e695a1f9d-kube-api-access-5z42d\") pod \"test-operator-controller-manager-5854674fcc-g2ddn\" (UID: \"573accdf-9cb1-4643-af86-744e695a1f9d\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.793649 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84k79\" (UniqueName: \"kubernetes.io/projected/c900776b-c7ea-4e4d-9b6b-00245cf048ce-kube-api-access-84k79\") pod \"watcher-operator-controller-manager-769dc69bc-swrc5\" (UID: \"c900776b-c7ea-4e4d-9b6b-00245cf048ce\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.811390 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.813600 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.814675 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.818871 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.825420 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-fzlkq" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.826062 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.863874 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q"] Dec 01 09:24:17 crc kubenswrapper[4867]: W1201 09:24:17.866295 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e850850_d946_42aa_a035_1bf8dcba402f.slice/crio-df1f78e46cae6122757c69ee6657d372ad5a5eaee08d92273f828b536682fcb7 WatchSource:0}: Error finding container df1f78e46cae6122757c69ee6657d372ad5a5eaee08d92273f828b536682fcb7: Status 404 returned error can't find the container with id df1f78e46cae6122757c69ee6657d372ad5a5eaee08d92273f828b536682fcb7 Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.874178 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.922521 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm7b7\" (UniqueName: \"kubernetes.io/projected/860dbd82-4e88-4090-8ce6-658e3201ef67-kube-api-access-nm7b7\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.922699 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.922866 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.922913 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:17 crc kubenswrapper[4867]: 
E1201 09:24:17.924010 4867 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:17 crc kubenswrapper[4867]: E1201 09:24:17.924099 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert podName:4d73996e-90d0-44f5-85f9-3800f54fc3d7 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:18.924077904 +0000 UTC m=+980.383464658 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" (UID: "4d73996e-90d0-44f5-85f9-3800f54fc3d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.955109 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r"] Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.956047 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.967700 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-btspw" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.968433 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.981883 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.990657 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" Dec 01 09:24:17 crc kubenswrapper[4867]: I1201 09:24:17.994896 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r"] Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.025516 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.025567 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.025607 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm7b7\" (UniqueName: \"kubernetes.io/projected/860dbd82-4e88-4090-8ce6-658e3201ef67-kube-api-access-nm7b7\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.025725 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg2kj\" (UniqueName: \"kubernetes.io/projected/bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf-kube-api-access-mg2kj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-68x8r\" (UID: \"bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.028133 4867 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.028187 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:18.528171896 +0000 UTC m=+979.987558650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "webhook-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.028389 4867 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.028411 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:18.528404334 +0000 UTC m=+979.987791088 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "metrics-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.066195 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56"] Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.068549 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm7b7\" (UniqueName: \"kubernetes.io/projected/860dbd82-4e88-4090-8ce6-658e3201ef67-kube-api-access-nm7b7\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.133102 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg2kj\" (UniqueName: \"kubernetes.io/projected/bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf-kube-api-access-mg2kj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-68x8r\" (UID: \"bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.183473 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" event={"ID":"0e850850-d946-42aa-a035-1bf8dcba402f","Type":"ContainerStarted","Data":"df1f78e46cae6122757c69ee6657d372ad5a5eaee08d92273f828b536682fcb7"} Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.184792 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg2kj\" (UniqueName: 
\"kubernetes.io/projected/bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf-kube-api-access-mg2kj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-68x8r\" (UID: \"bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.239891 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.240041 4867 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.240087 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert podName:b5f9e64b-a7d0-4437-91ac-f84c2441cd8d nodeName:}" failed. No retries permitted until 2025-12-01 09:24:20.240074152 +0000 UTC m=+981.699460906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert") pod "infra-operator-controller-manager-57548d458d-24whr" (UID: "b5f9e64b-a7d0-4437-91ac-f84c2441cd8d") : secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.318710 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd"] Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.369381 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.544331 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.545002 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.545208 4867 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.545263 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:19.545248069 +0000 UTC m=+981.004634823 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "webhook-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.545576 4867 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.545600 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:19.545593038 +0000 UTC m=+981.004979792 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "metrics-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.677040 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.677775 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.748649 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.820094 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms"] Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.887061 4867 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd"] Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.901955 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v"] Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.952974 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.953195 4867 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: E1201 09:24:18.953250 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert podName:4d73996e-90d0-44f5-85f9-3800f54fc3d7 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:20.953232894 +0000 UTC m=+982.412619648 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" (UID: "4d73996e-90d0-44f5-85f9-3800f54fc3d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:18 crc kubenswrapper[4867]: I1201 09:24:18.961525 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.023421 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.065275 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.084873 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf"] Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.088689 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7m7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-mxkvs_openstack-operators(e9fd074d-b9bc-4215-bfd7-56df604f101c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.090270 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5"] Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.092137 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7m7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-mxkvs_openstack-operators(e9fd074d-b9bc-4215-bfd7-56df604f101c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.093230 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" 
podUID="e9fd074d-b9bc-4215-bfd7-56df604f101c" Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.095662 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.100302 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.112221 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.117210 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.197038 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" event={"ID":"468cf199-ea48-4a5a-ac34-057670369f66","Type":"ContainerStarted","Data":"896c3ce91b6674a7c62c192d47216e79b3953373b49f8b3f9fd09604a0f1626d"} Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.197987 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" event={"ID":"c10410e7-47b2-4a48-bf7d-440a00afd4b4","Type":"ContainerStarted","Data":"37381bb055d67f623033d97f93e98f60168ffadb3d9b52108bb9f6d1810c96c3"} Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.201248 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" event={"ID":"30c79a23-86f2-4a05-adde-41ada03e2e7e","Type":"ContainerStarted","Data":"581880ca22f9519f28becc923787191fbebc02c3681213d1a2ef49d6bbaffe9b"} Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.202427 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" event={"ID":"656a9362-30cf-43f6-9909-95859bef129e","Type":"ContainerStarted","Data":"ac77f0da6e659226d498abd9ea4d7e43eff4276db9f7a852de4a1e1be98c5f07"} Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.203470 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" event={"ID":"fd8d1846-f143-4ca0-88df-af3eca96175d","Type":"ContainerStarted","Data":"d53c43522d8c17ff4680726c6d742aa93aca9b3635393cff6f894ebb8cf47240"} Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.208652 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" event={"ID":"68e139fd-19f5-4033-93b8-4ebf8397b510","Type":"ContainerStarted","Data":"63840acd984af3765350b54ff78377f4e5c66e3885c08c5dc67db5ce04336dc5"} Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.209903 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" event={"ID":"07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a","Type":"ContainerStarted","Data":"a2bf5304b8ded70e196c5922692f54122173e5d8ac03661b20ca107ed24523a7"} Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.212856 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" event={"ID":"cb8d2624-ad08-41e7-bb2a-48bc75a2dd62","Type":"ContainerStarted","Data":"8112671251b2e50c572642134b1ab23c4d338b4e637ee18325c4e5e7e015790b"} Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.223295 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" event={"ID":"0d369519-2f02-4efe-9deb-885362964597","Type":"ContainerStarted","Data":"633b6cc9f3dd7842408dc5a71337f7e195d6e83d1431cc86b82caba2d9415667"} Dec 01 09:24:19 crc 
kubenswrapper[4867]: I1201 09:24:19.226679 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" event={"ID":"e9fd074d-b9bc-4215-bfd7-56df604f101c","Type":"ContainerStarted","Data":"50e44366a2e881635f4f41fd11986d46733e597e550ec5fc3ff2783597328540"} Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.236986 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" podUID="e9fd074d-b9bc-4215-bfd7-56df604f101c" Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.237212 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" event={"ID":"a8956c5b-7421-4442-8d62-773a5fe02fd0","Type":"ContainerStarted","Data":"7766d84dffb3dcc4dcb3fe1da8437b95ba23c10fd2419a0cc94fdcd4eba54088"} Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.255217 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r8wh4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-grrzp_openstack-operators(461764b0-73a3-4866-aec1-e687293591e3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.255281 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.255307 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" event={"ID":"c900776b-c7ea-4e4d-9b6b-00245cf048ce","Type":"ContainerStarted","Data":"0a15624cf9c4a9ab98b37062508d7d3f0ef361c38a36fa257dbc2d050a3fb390"} Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.259363 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" event={"ID":"0deeeac8-147f-441c-ba67-2e6e9bc32073","Type":"ContainerStarted","Data":"126315d3b6a4b65c408e252ba35abae63c30288b5c870b72c625b7f375a85142"} Dec 01 09:24:19 crc kubenswrapper[4867]: W1201 09:24:19.269903 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3176675_0a3a_4fd2_9727_349ec1b88de7.slice/crio-4a41d715216dff153d0bc493ac699468222dc21342a74031d0ca099edc7ac312 WatchSource:0}: Error finding container 4a41d715216dff153d0bc493ac699468222dc21342a74031d0ca099edc7ac312: Status 404 returned error can't find the container with id 4a41d715216dff153d0bc493ac699468222dc21342a74031d0ca099edc7ac312 Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.276340 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.284262 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc"] Dec 01 09:24:19 crc kubenswrapper[4867]: W1201 09:24:19.284722 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573accdf_9cb1_4643_af86_744e695a1f9d.slice/crio-0bc109b2c807e4432dd3b335b552e51b4c9bc1e019415c21e5f9e9b86166849f WatchSource:0}: Error finding container 0bc109b2c807e4432dd3b335b552e51b4c9bc1e019415c21e5f9e9b86166849f: Status 404 returned error can't find the container with id 0bc109b2c807e4432dd3b335b552e51b4c9bc1e019415c21e5f9e9b86166849f Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.295225 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9c6lq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-j698r_openstack-operators(f3176675-0a3a-4fd2-9727-349ec1b88de7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.295306 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5z42d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-g2ddn_openstack-operators(573accdf-9cb1-4643-af86-744e695a1f9d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.296627 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dk2dw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-l7jwc_openstack-operators(9be92a6c-afef-449e-927b-8d0732a2140a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.297472 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9c6lq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-j698r_openstack-operators(f3176675-0a3a-4fd2-9727-349ec1b88de7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.298164 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5z42d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-g2ddn_openstack-operators(573accdf-9cb1-4643-af86-744e695a1f9d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.298632 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" podUID="f3176675-0a3a-4fd2-9727-349ec1b88de7" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.298785 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dk2dw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-l7jwc_openstack-operators(9be92a6c-afef-449e-927b-8d0732a2140a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.299262 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" podUID="573accdf-9cb1-4643-af86-744e695a1f9d" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.301329 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l2wxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-bhgk8_openstack-operators(ffbd9e52-147b-42cd-abaa-ec7d1341b826): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.301416 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" podUID="9be92a6c-afef-449e-927b-8d0732a2140a" Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.307988 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn"] Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.308121 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l2wxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-bhgk8_openstack-operators(ffbd9e52-147b-42cd-abaa-ec7d1341b826): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.309217 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" podUID="ffbd9e52-147b-42cd-abaa-ec7d1341b826" Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.314226 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.317189 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-22kcv" Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.413195 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r"] Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.418834 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mg2kj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-68x8r_openstack-operators(bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.420058 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" podUID="bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf" Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.431985 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-22kcv"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.534429 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5cxt"] Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.534709 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h5cxt" podUID="4415af65-5e2b-472d-b687-54fa137bf02e" 
containerName="registry-server" containerID="cri-o://11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a" gracePeriod=2 Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.593919 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:19 crc kubenswrapper[4867]: I1201 09:24:19.593974 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.595613 4867 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.595701 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:21.595683443 +0000 UTC m=+983.055070197 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "webhook-server-cert" not found Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.595970 4867 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 09:24:19 crc kubenswrapper[4867]: E1201 09:24:19.596052 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:21.596034252 +0000 UTC m=+983.055421076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "metrics-server-cert" not found Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.120978 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.215243 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-catalog-content\") pod \"4415af65-5e2b-472d-b687-54fa137bf02e\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.215373 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn58n\" (UniqueName: \"kubernetes.io/projected/4415af65-5e2b-472d-b687-54fa137bf02e-kube-api-access-tn58n\") pod \"4415af65-5e2b-472d-b687-54fa137bf02e\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.215508 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-utilities\") pod \"4415af65-5e2b-472d-b687-54fa137bf02e\" (UID: \"4415af65-5e2b-472d-b687-54fa137bf02e\") " Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.216698 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-utilities" (OuterVolumeSpecName: "utilities") pod "4415af65-5e2b-472d-b687-54fa137bf02e" (UID: "4415af65-5e2b-472d-b687-54fa137bf02e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.223133 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4415af65-5e2b-472d-b687-54fa137bf02e-kube-api-access-tn58n" (OuterVolumeSpecName: "kube-api-access-tn58n") pod "4415af65-5e2b-472d-b687-54fa137bf02e" (UID: "4415af65-5e2b-472d-b687-54fa137bf02e"). InnerVolumeSpecName "kube-api-access-tn58n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.317270 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.317410 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn58n\" (UniqueName: \"kubernetes.io/projected/4415af65-5e2b-472d-b687-54fa137bf02e-kube-api-access-tn58n\") on node \"crc\" DevicePath \"\"" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.317427 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.317532 4867 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.317587 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert podName:b5f9e64b-a7d0-4437-91ac-f84c2441cd8d nodeName:}" failed. No retries permitted until 2025-12-01 09:24:24.317568716 +0000 UTC m=+985.776955470 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert") pod "infra-operator-controller-manager-57548d458d-24whr" (UID: "b5f9e64b-a7d0-4437-91ac-f84c2441cd8d") : secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.328195 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4415af65-5e2b-472d-b687-54fa137bf02e" (UID: "4415af65-5e2b-472d-b687-54fa137bf02e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.352919 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" event={"ID":"573accdf-9cb1-4643-af86-744e695a1f9d","Type":"ContainerStarted","Data":"0bc109b2c807e4432dd3b335b552e51b4c9bc1e019415c21e5f9e9b86166849f"} Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.362726 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" podUID="573accdf-9cb1-4643-af86-744e695a1f9d" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.362936 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" 
event={"ID":"ffbd9e52-147b-42cd-abaa-ec7d1341b826","Type":"ContainerStarted","Data":"16077c9d26936bafd07b1a3f7c9b58e3959019429ba8d79b4609650a26f1040b"} Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.381331 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" podUID="ffbd9e52-147b-42cd-abaa-ec7d1341b826" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.381505 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" event={"ID":"f3176675-0a3a-4fd2-9727-349ec1b88de7","Type":"ContainerStarted","Data":"4a41d715216dff153d0bc493ac699468222dc21342a74031d0ca099edc7ac312"} Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.396035 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" podUID="f3176675-0a3a-4fd2-9727-349ec1b88de7" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.424015 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" 
event={"ID":"461764b0-73a3-4866-aec1-e687293591e3","Type":"ContainerStarted","Data":"eccd972a636af2e998c976186933ceca80f9e262ec17fdade74254bdac607303"} Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.424085 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415af65-5e2b-472d-b687-54fa137bf02e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.464015 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" event={"ID":"9be92a6c-afef-449e-927b-8d0732a2140a","Type":"ContainerStarted","Data":"88e97f26b098237b8d4ebcb689255f9b645237df4b837f35b2dc964deec84970"} Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.501295 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" podUID="9be92a6c-afef-449e-927b-8d0732a2140a" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.511133 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" event={"ID":"bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf","Type":"ContainerStarted","Data":"47aba58e2fbb983e69122cee69f1e9ddb669cab74bdd05fe51c44367eceb63c2"} Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.520081 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" podUID="bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.594944 4867 generic.go:334] "Generic (PLEG): container finished" podID="4415af65-5e2b-472d-b687-54fa137bf02e" containerID="11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a" exitCode=0 Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.595611 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5cxt" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.608041 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5cxt" event={"ID":"4415af65-5e2b-472d-b687-54fa137bf02e","Type":"ContainerDied","Data":"11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a"} Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.608127 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5cxt" event={"ID":"4415af65-5e2b-472d-b687-54fa137bf02e","Type":"ContainerDied","Data":"684ee07a07d5c496930fc507ac5f80fc8d78deb7876e90c2024be1f38a796409"} Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.608167 4867 scope.go:117] "RemoveContainer" containerID="11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a" Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.629853 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" podUID="e9fd074d-b9bc-4215-bfd7-56df604f101c" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.688967 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5cxt"] Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.697216 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h5cxt"] Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.843701 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4415af65-5e2b-472d-b687-54fa137bf02e" path="/var/lib/kubelet/pods/4415af65-5e2b-472d-b687-54fa137bf02e/volumes" Dec 01 09:24:20 crc kubenswrapper[4867]: I1201 09:24:20.955191 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.955421 4867 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:20 crc kubenswrapper[4867]: E1201 09:24:20.955475 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert podName:4d73996e-90d0-44f5-85f9-3800f54fc3d7 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:24.955458278 +0000 UTC m=+986.414845032 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" (UID: "4d73996e-90d0-44f5-85f9-3800f54fc3d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:21 crc kubenswrapper[4867]: E1201 09:24:21.612425 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" podUID="bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf" Dec 01 09:24:21 crc kubenswrapper[4867]: E1201 09:24:21.617090 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" podUID="f3176675-0a3a-4fd2-9727-349ec1b88de7" Dec 01 09:24:21 crc kubenswrapper[4867]: E1201 09:24:21.623478 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" podUID="573accdf-9cb1-4643-af86-744e695a1f9d" Dec 01 09:24:21 crc kubenswrapper[4867]: E1201 09:24:21.623712 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" podUID="ffbd9e52-147b-42cd-abaa-ec7d1341b826" Dec 01 09:24:21 crc kubenswrapper[4867]: E1201 09:24:21.623865 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" podUID="9be92a6c-afef-449e-927b-8d0732a2140a" Dec 01 09:24:21 crc kubenswrapper[4867]: I1201 09:24:21.666994 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:21 crc kubenswrapper[4867]: I1201 09:24:21.667047 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:21 crc kubenswrapper[4867]: E1201 09:24:21.667276 4867 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 09:24:21 crc kubenswrapper[4867]: E1201 09:24:21.667354 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:25.667332876 +0000 UTC m=+987.126719690 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "webhook-server-cert" not found Dec 01 09:24:21 crc kubenswrapper[4867]: E1201 09:24:21.668056 4867 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 09:24:21 crc kubenswrapper[4867]: E1201 09:24:21.668093 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:25.668082886 +0000 UTC m=+987.127469710 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "metrics-server-cert" not found Dec 01 09:24:23 crc kubenswrapper[4867]: I1201 09:24:23.005595 4867 scope.go:117] "RemoveContainer" containerID="120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61" Dec 01 09:24:23 crc kubenswrapper[4867]: I1201 09:24:23.392730 4867 scope.go:117] "RemoveContainer" containerID="beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d" Dec 01 09:24:24 crc kubenswrapper[4867]: I1201 09:24:24.320324 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:24 crc kubenswrapper[4867]: E1201 09:24:24.320673 4867 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:24 crc kubenswrapper[4867]: E1201 09:24:24.320967 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert podName:b5f9e64b-a7d0-4437-91ac-f84c2441cd8d nodeName:}" failed. No retries permitted until 2025-12-01 09:24:32.320950956 +0000 UTC m=+993.780337710 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert") pod "infra-operator-controller-manager-57548d458d-24whr" (UID: "b5f9e64b-a7d0-4437-91ac-f84c2441cd8d") : secret "infra-operator-webhook-server-cert" not found Dec 01 09:24:25 crc kubenswrapper[4867]: I1201 09:24:25.032164 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:25 crc kubenswrapper[4867]: E1201 09:24:25.032396 4867 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:25 crc kubenswrapper[4867]: E1201 09:24:25.032502 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert podName:4d73996e-90d0-44f5-85f9-3800f54fc3d7 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:33.032478404 +0000 UTC m=+994.491865168 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" (UID: "4d73996e-90d0-44f5-85f9-3800f54fc3d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:24:25 crc kubenswrapper[4867]: I1201 09:24:25.743879 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:25 crc kubenswrapper[4867]: I1201 09:24:25.743934 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:25 crc kubenswrapper[4867]: E1201 09:24:25.744157 4867 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 09:24:25 crc kubenswrapper[4867]: E1201 09:24:25.744213 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:33.744198158 +0000 UTC m=+995.203584912 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "metrics-server-cert" not found Dec 01 09:24:25 crc kubenswrapper[4867]: E1201 09:24:25.744536 4867 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 09:24:25 crc kubenswrapper[4867]: E1201 09:24:25.744560 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs podName:860dbd82-4e88-4090-8ce6-658e3201ef67 nodeName:}" failed. No retries permitted until 2025-12-01 09:24:33.744552817 +0000 UTC m=+995.203939571 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs") pod "openstack-operator-controller-manager-56cfc94774-wn77q" (UID: "860dbd82-4e88-4090-8ce6-658e3201ef67") : secret "webhook-server-cert" not found Dec 01 09:24:32 crc kubenswrapper[4867]: I1201 09:24:32.336014 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:32 crc kubenswrapper[4867]: I1201 09:24:32.346556 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5f9e64b-a7d0-4437-91ac-f84c2441cd8d-cert\") pod \"infra-operator-controller-manager-57548d458d-24whr\" (UID: \"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:32 crc 
kubenswrapper[4867]: I1201 09:24:32.477305 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:24:33 crc kubenswrapper[4867]: I1201 09:24:33.047213 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:33 crc kubenswrapper[4867]: I1201 09:24:33.051665 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d73996e-90d0-44f5-85f9-3800f54fc3d7-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4\" (UID: \"4d73996e-90d0-44f5-85f9-3800f54fc3d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:33 crc kubenswrapper[4867]: I1201 09:24:33.116800 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:24:33 crc kubenswrapper[4867]: I1201 09:24:33.757188 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:33 crc kubenswrapper[4867]: I1201 09:24:33.757249 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:33 crc kubenswrapper[4867]: I1201 09:24:33.768833 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-metrics-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:33 crc kubenswrapper[4867]: I1201 09:24:33.769180 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/860dbd82-4e88-4090-8ce6-658e3201ef67-webhook-certs\") pod \"openstack-operator-controller-manager-56cfc94774-wn77q\" (UID: \"860dbd82-4e88-4090-8ce6-658e3201ef67\") " pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:33 crc kubenswrapper[4867]: I1201 09:24:33.921973 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:24:37 crc kubenswrapper[4867]: E1201 09:24:37.894534 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 01 09:24:37 crc kubenswrapper[4867]: E1201 09:24:37.895068 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hn6x5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-77sbx_openstack-operators(cb8d2624-ad08-41e7-bb2a-48bc75a2dd62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:39 crc kubenswrapper[4867]: E1201 09:24:39.604430 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 01 09:24:39 crc kubenswrapper[4867]: E1201 09:24:39.604935 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t6vsb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-p7rms_openstack-operators(468cf199-ea48-4a5a-ac34-057670369f66): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:41 crc kubenswrapper[4867]: E1201 09:24:41.034566 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:440cde33d3a2a0c545cd1c110a3634eb85544370f448865b97a13c38034b0172" Dec 01 09:24:41 crc kubenswrapper[4867]: E1201 09:24:41.034741 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:440cde33d3a2a0c545cd1c110a3634eb85544370f448865b97a13c38034b0172,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sh46m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-668d9c48b9-vktv2_openstack-operators(68e139fd-19f5-4033-93b8-4ebf8397b510): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:43 crc kubenswrapper[4867]: E1201 09:24:43.738970 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 01 09:24:43 crc kubenswrapper[4867]: E1201 09:24:43.739440 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wzjtw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-zdllr_openstack-operators(0d369519-2f02-4efe-9deb-885362964597): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:45 crc kubenswrapper[4867]: E1201 09:24:45.822562 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621" Dec 01 09:24:45 crc kubenswrapper[4867]: E1201 09:24:45.822988 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-84k79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-swrc5_openstack-operators(c900776b-c7ea-4e4d-9b6b-00245cf048ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:46 crc kubenswrapper[4867]: E1201 09:24:46.483354 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 01 09:24:46 crc kubenswrapper[4867]: E1201 09:24:46.483869 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v6cxx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-b4j75_openstack-operators(fd8d1846-f143-4ca0-88df-af3eca96175d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:46 crc kubenswrapper[4867]: E1201 09:24:46.977532 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 01 09:24:46 crc kubenswrapper[4867]: E1201 09:24:46.977760 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-grktq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-9nc4v_openstack-operators(0deeeac8-147f-441c-ba67-2e6e9bc32073): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:47 crc kubenswrapper[4867]: E1201 09:24:47.477529 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 01 09:24:47 crc kubenswrapper[4867]: E1201 09:24:47.477710 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rc7kg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-4wbsd_openstack-operators(c10410e7-47b2-4a48-bf7d-440a00afd4b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:51 crc kubenswrapper[4867]: E1201 09:24:51.091901 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 01 09:24:51 crc kubenswrapper[4867]: E1201 09:24:51.092114 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6vs75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-twg2p_openstack-operators(30c79a23-86f2-4a05-adde-41ada03e2e7e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:51 crc kubenswrapper[4867]: E1201 09:24:51.550087 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:ecf7be921850bdc04697ed1b332bab39ad2a64e4e45c2a445c04f9bae6ac61b5" Dec 01 09:24:51 crc kubenswrapper[4867]: E1201 09:24:51.550288 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:ecf7be921850bdc04697ed1b332bab39ad2a64e4e45c2a445c04f9bae6ac61b5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n79cg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6546668bfd-hlksd_openstack-operators(a8956c5b-7421-4442-8d62-773a5fe02fd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:52 crc kubenswrapper[4867]: E1201 09:24:52.111449 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 01 09:24:52 crc kubenswrapper[4867]: E1201 09:24:52.112506 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9c6lq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-j698r_openstack-operators(f3176675-0a3a-4fd2-9727-349ec1b88de7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:54 crc kubenswrapper[4867]: E1201 09:24:54.787418 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 01 09:24:54 crc kubenswrapper[4867]: E1201 09:24:54.787989 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5z42d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-g2ddn_openstack-operators(573accdf-9cb1-4643-af86-744e695a1f9d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:55 crc kubenswrapper[4867]: E1201 09:24:55.708529 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 09:24:55 crc kubenswrapper[4867]: E1201 09:24:55.708736 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hn6x5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-77sbx_openstack-operators(cb8d2624-ad08-41e7-bb2a-48bc75a2dd62): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 01 09:24:55 crc kubenswrapper[4867]: E1201 09:24:55.710746 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" podUID="cb8d2624-ad08-41e7-bb2a-48bc75a2dd62" Dec 01 09:24:57 crc kubenswrapper[4867]: E1201 09:24:57.614405 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 01 09:24:57 crc kubenswrapper[4867]: E1201 
09:24:57.614878 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nf772,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-sjmfh_openstack-operators(656a9362-30cf-43f6-9909-95859bef129e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:58 crc kubenswrapper[4867]: E1201 09:24:58.038781 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385" Dec 01 09:24:58 crc kubenswrapper[4867]: E1201 09:24:58.039023 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dk2dw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-l7jwc_openstack-operators(9be92a6c-afef-449e-927b-8d0732a2140a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:24:59 crc kubenswrapper[4867]: I1201 09:24:59.187650 4867 scope.go:117] "RemoveContainer" containerID="beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d" Dec 01 09:25:00 crc kubenswrapper[4867]: E1201 09:25:00.115875 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 01 09:25:00 crc kubenswrapper[4867]: E1201 09:25:00.116120 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l2wxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-bhgk8_openstack-operators(ffbd9e52-147b-42cd-abaa-ec7d1341b826): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:25:00 crc kubenswrapper[4867]: E1201 09:25:00.676454 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 01 09:25:00 crc kubenswrapper[4867]: E1201 09:25:00.676877 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7m7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-mxkvs_openstack-operators(e9fd074d-b9bc-4215-bfd7-56df604f101c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:25:00 crc kubenswrapper[4867]: I1201 09:25:00.677048 4867 scope.go:117] "RemoveContainer" containerID="11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a" Dec 01 09:25:00 crc kubenswrapper[4867]: E1201 09:25:00.677413 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a\": container with ID starting with 11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a not found: ID does not exist" containerID="11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a" Dec 01 09:25:00 crc kubenswrapper[4867]: I1201 09:25:00.677447 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a"} err="failed to get container status 
\"11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a\": rpc error: code = NotFound desc = could not find container \"11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a\": container with ID starting with 11d0a161c0127c9e149173a3485333ff793f9e6e359fa9a68deb1af63faa761a not found: ID does not exist" Dec 01 09:25:00 crc kubenswrapper[4867]: I1201 09:25:00.677471 4867 scope.go:117] "RemoveContainer" containerID="120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61" Dec 01 09:25:00 crc kubenswrapper[4867]: E1201 09:25:00.677737 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61\": container with ID starting with 120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61 not found: ID does not exist" containerID="120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61" Dec 01 09:25:00 crc kubenswrapper[4867]: I1201 09:25:00.677758 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61"} err="failed to get container status \"120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61\": rpc error: code = NotFound desc = could not find container \"120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61\": container with ID starting with 120c5351b81d9a84e66f1739aa68a940f5b090874a2da2a6625d688637e5ab61 not found: ID does not exist" Dec 01 09:25:00 crc kubenswrapper[4867]: I1201 09:25:00.677771 4867 scope.go:117] "RemoveContainer" containerID="beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d" Dec 01 09:25:00 crc kubenswrapper[4867]: E1201 09:25:00.678509 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d\": container with ID starting with beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d not found: ID does not exist" containerID="beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d" Dec 01 09:25:00 crc kubenswrapper[4867]: I1201 09:25:00.678533 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d"} err="failed to get container status \"beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d\": rpc error: code = NotFound desc = could not find container \"beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d\": container with ID starting with beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d not found: ID does not exist" Dec 01 09:25:02 crc kubenswrapper[4867]: E1201 09:25:02.868917 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3" Dec 01 09:25:02 crc kubenswrapper[4867]: E1201 09:25:02.869415 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cdpsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-492tf_openstack-operators(07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:25:03 crc kubenswrapper[4867]: I1201 09:25:03.351112 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q"] 
Dec 01 09:25:05 crc kubenswrapper[4867]: E1201 09:25:05.601055 4867 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_extract-utilities_community-operators-h5cxt_openshift-marketplace_4415af65-5e2b-472d-b687-54fa137bf02e_0 in pod sandbox 684ee07a07d5c496930fc507ac5f80fc8d78deb7876e90c2024be1f38a796409 from index: no such id: 'beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d'" containerID="beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d" Dec 01 09:25:05 crc kubenswrapper[4867]: E1201 09:25:05.601125 4867 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_extract-utilities_community-operators-h5cxt_openshift-marketplace_4415af65-5e2b-472d-b687-54fa137bf02e_0 in pod sandbox 684ee07a07d5c496930fc507ac5f80fc8d78deb7876e90c2024be1f38a796409 from index: no such id: 'beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d'" containerID="beaf11dadb669322482eaec9bd11270e5487bccc34fccab5586d3dcb111fce8d" Dec 01 09:25:05 crc kubenswrapper[4867]: E1201 09:25:05.641073 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 01 09:25:05 crc kubenswrapper[4867]: E1201 09:25:05.641647 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mg2kj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-68x8r_openstack-operators(bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:25:05 crc kubenswrapper[4867]: E1201 09:25:05.642800 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" podUID="bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf" Dec 01 09:25:05 crc kubenswrapper[4867]: I1201 09:25:05.873197 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-24whr"] Dec 01 09:25:05 crc kubenswrapper[4867]: I1201 09:25:05.945720 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" event={"ID":"860dbd82-4e88-4090-8ce6-658e3201ef67","Type":"ContainerStarted","Data":"65712e6251461cbcedfd1f246f3866307bc7c325fa2d9da35d95cc206236491e"} Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.192905 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.193320 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9c6lq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-j698r_openstack-operators(f3176675-0a3a-4fd2-9727-349ec1b88de7): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.194511 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" podUID="f3176675-0a3a-4fd2-9727-349ec1b88de7" Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.265417 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.265600 4867 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r8wh4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-grrzp_openstack-operators(461764b0-73a3-4866-aec1-e687293591e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.267839 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" podUID="461764b0-73a3-4866-aec1-e687293591e3" Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.617636 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" podUID="468cf199-ea48-4a5a-ac34-057670369f66" Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.628913 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" podUID="656a9362-30cf-43f6-9909-95859bef129e" Dec 01 09:25:06 crc kubenswrapper[4867]: I1201 09:25:06.697902 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4"] Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.813594 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" podUID="a8956c5b-7421-4442-8d62-773a5fe02fd0" Dec 01 09:25:06 crc kubenswrapper[4867]: I1201 09:25:06.967570 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" event={"ID":"4d73996e-90d0-44f5-85f9-3800f54fc3d7","Type":"ContainerStarted","Data":"58fa712e950323c097fd3e562140c7ad799a9e8a6aecfc5381d46d1d1646f2df"} Dec 01 09:25:06 crc kubenswrapper[4867]: I1201 09:25:06.973235 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" event={"ID":"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d","Type":"ContainerStarted","Data":"2d0ae91dfe730ea24e7e70d6ce0856995b4797d7d917c10557c31c260be53e65"} Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.979976 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" podUID="ffbd9e52-147b-42cd-abaa-ec7d1341b826" Dec 01 09:25:06 crc kubenswrapper[4867]: I1201 09:25:06.982586 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" event={"ID":"468cf199-ea48-4a5a-ac34-057670369f66","Type":"ContainerStarted","Data":"8ae69a3e5a829e18928c25a110ed7967adbcffc2ea965509557b0c523bbbc305"} Dec 01 09:25:06 crc kubenswrapper[4867]: E1201 09:25:06.984016 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" podUID="0d369519-2f02-4efe-9deb-885362964597" Dec 01 09:25:06 crc kubenswrapper[4867]: I1201 09:25:06.995301 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" event={"ID":"0e850850-d946-42aa-a035-1bf8dcba402f","Type":"ContainerStarted","Data":"10d338d705f1b02e5f5df169e7b0aef33e676c650f1ae285b74ff524309557ab"} Dec 01 09:25:06 crc kubenswrapper[4867]: I1201 09:25:06.995336 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" 
event={"ID":"0e850850-d946-42aa-a035-1bf8dcba402f","Type":"ContainerStarted","Data":"8422c63d4e2f9f3161fa44683a8efc0e46220f787f3d02894b4d19517d482221"} Dec 01 09:25:06 crc kubenswrapper[4867]: I1201 09:25:06.995917 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" Dec 01 09:25:07 crc kubenswrapper[4867]: I1201 09:25:07.114224 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" event={"ID":"cb8d2624-ad08-41e7-bb2a-48bc75a2dd62","Type":"ContainerStarted","Data":"edc78f36deca9ebe361f568ba4a22383ec7c6e0707cd7308f17303e7288b2249"} Dec 01 09:25:07 crc kubenswrapper[4867]: I1201 09:25:07.131966 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" podStartSLOduration=16.94956598 podStartE2EDuration="51.131929524s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:17.915425672 +0000 UTC m=+979.374812426" lastFinishedPulling="2025-12-01 09:24:52.097789216 +0000 UTC m=+1013.557175970" observedRunningTime="2025-12-01 09:25:07.127313159 +0000 UTC m=+1028.586699913" watchObservedRunningTime="2025-12-01 09:25:07.131929524 +0000 UTC m=+1028.591316278" Dec 01 09:25:07 crc kubenswrapper[4867]: I1201 09:25:07.160247 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" event={"ID":"860dbd82-4e88-4090-8ce6-658e3201ef67","Type":"ContainerStarted","Data":"3d662df2199df57af41837b960dd0975ac48d7b6817c49fa893d2802679a40a8"} Dec 01 09:25:07 crc kubenswrapper[4867]: I1201 09:25:07.160325 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:25:07 crc kubenswrapper[4867]: I1201 09:25:07.180161 
4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" event={"ID":"656a9362-30cf-43f6-9909-95859bef129e","Type":"ContainerStarted","Data":"d91a3120f4d7ce71f68e6e334f6ce4aef621870492df6d1bf3f8f45fc8bd7d0b"} Dec 01 09:25:07 crc kubenswrapper[4867]: I1201 09:25:07.197366 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" podStartSLOduration=50.197341558 podStartE2EDuration="50.197341558s" podCreationTimestamp="2025-12-01 09:24:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:25:07.190847004 +0000 UTC m=+1028.650233748" watchObservedRunningTime="2025-12-01 09:25:07.197341558 +0000 UTC m=+1028.656728302" Dec 01 09:25:07 crc kubenswrapper[4867]: I1201 09:25:07.199139 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" event={"ID":"a8956c5b-7421-4442-8d62-773a5fe02fd0","Type":"ContainerStarted","Data":"de90047969f1bc8e272289526436af943dceed09242163e59a096c4a3fe2dd83"} Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.204657 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" podUID="656a9362-30cf-43f6-9909-95859bef129e" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.532782 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" podUID="e9fd074d-b9bc-4215-bfd7-56df604f101c" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.538586 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" podUID="0deeeac8-147f-441c-ba67-2e6e9bc32073" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.538687 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" podUID="9be92a6c-afef-449e-927b-8d0732a2140a" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.562254 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" podUID="30c79a23-86f2-4a05-adde-41ada03e2e7e" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.562393 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" podUID="573accdf-9cb1-4643-af86-744e695a1f9d" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.562432 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" podUID="fd8d1846-f143-4ca0-88df-af3eca96175d" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.572179 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" podUID="07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.572242 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" podUID="c900776b-c7ea-4e4d-9b6b-00245cf048ce" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.576357 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" podUID="c10410e7-47b2-4a48-bf7d-440a00afd4b4" Dec 01 09:25:07 crc kubenswrapper[4867]: E1201 09:25:07.576502 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" podUID="68e139fd-19f5-4033-93b8-4ebf8397b510" Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.207028 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" 
event={"ID":"c10410e7-47b2-4a48-bf7d-440a00afd4b4","Type":"ContainerStarted","Data":"e37cbfa47a146afd2a1a96cf8e304344be8b50feceb3063c0b8de1c4c92687dc"} Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.211601 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" event={"ID":"0d369519-2f02-4efe-9deb-885362964597","Type":"ContainerStarted","Data":"b6d96650f567d0d0bd7866929a6165699268d0e9c610ecf4e827515748e92805"} Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.214435 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" event={"ID":"30c79a23-86f2-4a05-adde-41ada03e2e7e","Type":"ContainerStarted","Data":"726f6eb5420aed5d95afa07f20b9a5c998672b4925255f00a739115e98369e9d"} Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.217052 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" event={"ID":"0deeeac8-147f-441c-ba67-2e6e9bc32073","Type":"ContainerStarted","Data":"bed3ac7fafe5d57ad574db8c9430a4d7f58d4d645f7be13ea10a65ed009f269f"} Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.227235 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" event={"ID":"07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a","Type":"ContainerStarted","Data":"a86059f4a0c6b9d34ba4f0c159a0d90330591c77c41383acd1630b9deb97e528"} Dec 01 09:25:08 crc kubenswrapper[4867]: E1201 09:25:08.228331 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" 
podUID="07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a" Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.236721 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" event={"ID":"ffbd9e52-147b-42cd-abaa-ec7d1341b826","Type":"ContainerStarted","Data":"b261fe06212aa4e7e9f301aa5836d5c11551214e7b676afcf2410c292c82ec84"} Dec 01 09:25:08 crc kubenswrapper[4867]: E1201 09:25:08.239220 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" podUID="ffbd9e52-147b-42cd-abaa-ec7d1341b826" Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.260566 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" event={"ID":"e9fd074d-b9bc-4215-bfd7-56df604f101c","Type":"ContainerStarted","Data":"22c0e6e9194ba90ac9b83696f8b93817060b24d1ac082bc48316ded09a57348d"} Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.262622 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" event={"ID":"9be92a6c-afef-449e-927b-8d0732a2140a","Type":"ContainerStarted","Data":"fa7c5747eee76a8ddd62fb542a84324b7fd8243bf53e17cbc2714b7fb4f4578c"} Dec 01 09:25:08 crc kubenswrapper[4867]: E1201 09:25:08.263725 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" podUID="9be92a6c-afef-449e-927b-8d0732a2140a" Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.264977 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" event={"ID":"c900776b-c7ea-4e4d-9b6b-00245cf048ce","Type":"ContainerStarted","Data":"e3df375a38e6294c1d4aa7244833b0273001c14a79a86c6180edd69b65dbe2c2"} Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.274377 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" event={"ID":"fd8d1846-f143-4ca0-88df-af3eca96175d","Type":"ContainerStarted","Data":"3c40fee8a1a9c5081c5807e543cfae238452c908d01f13522d12835a66fc9200"} Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.284730 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" event={"ID":"68e139fd-19f5-4033-93b8-4ebf8397b510","Type":"ContainerStarted","Data":"c71b29f2461e0d4227a54e2444d02bd4463a427d7800d5ac3c32afe08e9cbdc1"} Dec 01 09:25:08 crc kubenswrapper[4867]: E1201 09:25:08.289392 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" podUID="e9fd074d-b9bc-4215-bfd7-56df604f101c" Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.322031 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" event={"ID":"573accdf-9cb1-4643-af86-744e695a1f9d","Type":"ContainerStarted","Data":"06ea08b9f9a8edec42fe7314bffd0a23036e35bb500b76eb96c6b1a77ca3955d"} Dec 01 
09:25:08 crc kubenswrapper[4867]: E1201 09:25:08.327018 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" podUID="573accdf-9cb1-4643-af86-744e695a1f9d" Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.351528 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" event={"ID":"cb8d2624-ad08-41e7-bb2a-48bc75a2dd62","Type":"ContainerStarted","Data":"751be7fafab826c059b882fa59f18719786edc628faebf81df3dc61a9745e8e0"} Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.351561 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" Dec 01 09:25:08 crc kubenswrapper[4867]: I1201 09:25:08.851309 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" podStartSLOduration=5.761883793 podStartE2EDuration="52.85128745s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.088561914 +0000 UTC m=+980.547948668" lastFinishedPulling="2025-12-01 09:25:06.177965571 +0000 UTC m=+1027.637352325" observedRunningTime="2025-12-01 09:25:08.539261708 +0000 UTC m=+1029.998648462" watchObservedRunningTime="2025-12-01 09:25:08.85128745 +0000 UTC m=+1030.310674204" Dec 01 09:25:09 crc kubenswrapper[4867]: E1201 09:25:09.389173 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" podUID="07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a" Dec 01 09:25:10 crc kubenswrapper[4867]: I1201 09:25:10.372042 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" event={"ID":"468cf199-ea48-4a5a-ac34-057670369f66","Type":"ContainerStarted","Data":"bafcb9c51a39fa8023d83c8f9a2c797c616c05c6c0b264593a6dde0ffa836cf6"} Dec 01 09:25:10 crc kubenswrapper[4867]: I1201 09:25:10.372194 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" Dec 01 09:25:10 crc kubenswrapper[4867]: I1201 09:25:10.389857 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" podStartSLOduration=4.720039734 podStartE2EDuration="54.389839465s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:18.855672547 +0000 UTC m=+980.315059301" lastFinishedPulling="2025-12-01 09:25:08.525472278 +0000 UTC m=+1029.984859032" observedRunningTime="2025-12-01 09:25:10.388992493 +0000 UTC m=+1031.848379247" watchObservedRunningTime="2025-12-01 09:25:10.389839465 +0000 UTC m=+1031.849226219" Dec 01 09:25:11 crc kubenswrapper[4867]: I1201 09:25:11.386506 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" event={"ID":"0deeeac8-147f-441c-ba67-2e6e9bc32073","Type":"ContainerStarted","Data":"aa22c5601769eabc37250e1ce3c0bcf8bbd899870d114e45590d5fcd42526e0a"} Dec 01 09:25:11 crc kubenswrapper[4867]: I1201 09:25:11.387346 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" Dec 01 09:25:11 crc kubenswrapper[4867]: I1201 09:25:11.390308 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" event={"ID":"c10410e7-47b2-4a48-bf7d-440a00afd4b4","Type":"ContainerStarted","Data":"5c8e27e017575b5742ac00fac9e5d3ade236c8b7f035bdf5334f80c7cbd727c1"} Dec 01 09:25:11 crc kubenswrapper[4867]: I1201 09:25:11.390357 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" Dec 01 09:25:11 crc kubenswrapper[4867]: I1201 09:25:11.412907 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" podStartSLOduration=3.471155488 podStartE2EDuration="55.41287742s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:18.902149953 +0000 UTC m=+980.361536707" lastFinishedPulling="2025-12-01 09:25:10.843871865 +0000 UTC m=+1032.303258639" observedRunningTime="2025-12-01 09:25:11.408970136 +0000 UTC m=+1032.868356890" watchObservedRunningTime="2025-12-01 09:25:11.41287742 +0000 UTC m=+1032.872264174" Dec 01 09:25:11 crc kubenswrapper[4867]: I1201 09:25:11.456484 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" podStartSLOduration=2.9140404330000003 podStartE2EDuration="55.45646825s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:18.365654981 +0000 UTC m=+979.825041745" lastFinishedPulling="2025-12-01 09:25:10.908082818 +0000 UTC m=+1032.367469562" observedRunningTime="2025-12-01 09:25:11.452491223 +0000 UTC m=+1032.911877977" watchObservedRunningTime="2025-12-01 09:25:11.45646825 +0000 UTC m=+1032.915855004" Dec 01 09:25:12 crc 
kubenswrapper[4867]: I1201 09:25:12.395983 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" event={"ID":"4d73996e-90d0-44f5-85f9-3800f54fc3d7","Type":"ContainerStarted","Data":"f63aeed8b026ff39a2e5b2783d9a7fd1d4172bfd51c8182e54275ba010c34a59"} Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.398847 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" event={"ID":"30c79a23-86f2-4a05-adde-41ada03e2e7e","Type":"ContainerStarted","Data":"d628110242c795b734dcd59eb98775723e35512e02bc149c9d4ec314bee81cc8"} Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.398918 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.401398 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" event={"ID":"656a9362-30cf-43f6-9909-95859bef129e","Type":"ContainerStarted","Data":"09b03fc9e635712aabd4b44a48ece5ec02691f539ba5eac28434e655be13c79e"} Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.401584 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.404890 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" event={"ID":"c900776b-c7ea-4e4d-9b6b-00245cf048ce","Type":"ContainerStarted","Data":"65081a55f34ec845f6ffdf0806b8e64a01a4402f0c8deac10d22ce10ff0f1f7e"} Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.405330 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.407297 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" event={"ID":"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d","Type":"ContainerStarted","Data":"abafc9f6311784f7c0f8f7cacd4635d345d71d3322e88b153b7103a69e5af530"} Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.408625 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" event={"ID":"fd8d1846-f143-4ca0-88df-af3eca96175d","Type":"ContainerStarted","Data":"8b472d57e747c60181b5f858c4804e2b356b57f5f1e64863cb2697659937213f"} Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.409012 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.417389 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" event={"ID":"68e139fd-19f5-4033-93b8-4ebf8397b510","Type":"ContainerStarted","Data":"2c286028cc034e26015f85d4d6ea3005fcc355b95648a471199dbb5c137e457c"} Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.417543 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.420416 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" event={"ID":"0d369519-2f02-4efe-9deb-885362964597","Type":"ContainerStarted","Data":"9c735fda8eda89edf81b5f3d452073edbb70d593a273683654c32bf95e1220a3"} Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.420540 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.423251 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" podStartSLOduration=4.592462621 podStartE2EDuration="56.423240716s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.076309946 +0000 UTC m=+980.535696700" lastFinishedPulling="2025-12-01 09:25:10.907088041 +0000 UTC m=+1032.366474795" observedRunningTime="2025-12-01 09:25:12.422568117 +0000 UTC m=+1033.881954871" watchObservedRunningTime="2025-12-01 09:25:12.423240716 +0000 UTC m=+1033.882627460" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.430451 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" event={"ID":"a8956c5b-7421-4442-8d62-773a5fe02fd0","Type":"ContainerStarted","Data":"71f2a947696fc61a77565c6381e4e22d24e2fbadc3a0e6a92ff431269ae6ff87"} Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.430788 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.477355 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" podStartSLOduration=4.5801873 podStartE2EDuration="56.477340847s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.015872124 +0000 UTC m=+980.475258878" lastFinishedPulling="2025-12-01 09:25:10.913025671 +0000 UTC m=+1032.372412425" observedRunningTime="2025-12-01 09:25:12.471973543 +0000 UTC m=+1033.931360297" watchObservedRunningTime="2025-12-01 09:25:12.477340847 +0000 UTC m=+1033.936727601" Dec 01 09:25:12 crc 
kubenswrapper[4867]: I1201 09:25:12.549574 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" podStartSLOduration=4.869635436 podStartE2EDuration="56.549554905s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.016596303 +0000 UTC m=+980.475983057" lastFinishedPulling="2025-12-01 09:25:10.696515772 +0000 UTC m=+1032.155902526" observedRunningTime="2025-12-01 09:25:12.51321342 +0000 UTC m=+1033.972600174" watchObservedRunningTime="2025-12-01 09:25:12.549554905 +0000 UTC m=+1034.008941669" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.553726 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" podStartSLOduration=4.731948782 podStartE2EDuration="56.553708436s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.084020572 +0000 UTC m=+980.543407326" lastFinishedPulling="2025-12-01 09:25:10.905780226 +0000 UTC m=+1032.365166980" observedRunningTime="2025-12-01 09:25:12.540281985 +0000 UTC m=+1033.999668739" watchObservedRunningTime="2025-12-01 09:25:12.553708436 +0000 UTC m=+1034.013095200" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.578677 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" podStartSLOduration=3.556591148 podStartE2EDuration="55.578656965s" podCreationTimestamp="2025-12-01 09:24:17 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.085286637 +0000 UTC m=+980.544673391" lastFinishedPulling="2025-12-01 09:25:11.107352454 +0000 UTC m=+1032.566739208" observedRunningTime="2025-12-01 09:25:12.573223009 +0000 UTC m=+1034.032609763" watchObservedRunningTime="2025-12-01 09:25:12.578656965 +0000 UTC m=+1034.038043729" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 
09:25:12.623133 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" podStartSLOduration=4.73035975 podStartE2EDuration="56.623117498s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.015369531 +0000 UTC m=+980.474756285" lastFinishedPulling="2025-12-01 09:25:10.908127279 +0000 UTC m=+1032.367514033" observedRunningTime="2025-12-01 09:25:12.613711356 +0000 UTC m=+1034.073098110" watchObservedRunningTime="2025-12-01 09:25:12.623117498 +0000 UTC m=+1034.082504252" Dec 01 09:25:12 crc kubenswrapper[4867]: I1201 09:25:12.663473 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" podStartSLOduration=4.785326764 podStartE2EDuration="56.66345484s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:18.965665377 +0000 UTC m=+980.425052121" lastFinishedPulling="2025-12-01 09:25:10.843793413 +0000 UTC m=+1032.303180197" observedRunningTime="2025-12-01 09:25:12.660390027 +0000 UTC m=+1034.119776781" watchObservedRunningTime="2025-12-01 09:25:12.66345484 +0000 UTC m=+1034.122841594" Dec 01 09:25:13 crc kubenswrapper[4867]: I1201 09:25:13.448025 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" event={"ID":"b5f9e64b-a7d0-4437-91ac-f84c2441cd8d","Type":"ContainerStarted","Data":"56ec30020796e878c3bee0716baf3d2e897c82d895e93a064cc28a7335e37ff9"} Dec 01 09:25:13 crc kubenswrapper[4867]: I1201 09:25:13.448528 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:25:13 crc kubenswrapper[4867]: I1201 09:25:13.454426 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" event={"ID":"4d73996e-90d0-44f5-85f9-3800f54fc3d7","Type":"ContainerStarted","Data":"6eb43477779d3abfd0b5cdf231f4ad572fdce446618a3f2dcee1551e06c67134"} Dec 01 09:25:13 crc kubenswrapper[4867]: I1201 09:25:13.472923 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" podStartSLOduration=52.797768204 podStartE2EDuration="57.472897245s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:25:06.232515755 +0000 UTC m=+1027.691902509" lastFinishedPulling="2025-12-01 09:25:10.907644796 +0000 UTC m=+1032.367031550" observedRunningTime="2025-12-01 09:25:13.46561981 +0000 UTC m=+1034.925006564" watchObservedRunningTime="2025-12-01 09:25:13.472897245 +0000 UTC m=+1034.932284009" Dec 01 09:25:13 crc kubenswrapper[4867]: I1201 09:25:13.502941 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" podStartSLOduration=53.479295598 podStartE2EDuration="57.502920951s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:25:06.820984262 +0000 UTC m=+1028.280371016" lastFinishedPulling="2025-12-01 09:25:10.844609605 +0000 UTC m=+1032.303996369" observedRunningTime="2025-12-01 09:25:13.498580875 +0000 UTC m=+1034.957967649" watchObservedRunningTime="2025-12-01 09:25:13.502920951 +0000 UTC m=+1034.962307705" Dec 01 09:25:13 crc kubenswrapper[4867]: I1201 09:25:13.928650 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-56cfc94774-wn77q" Dec 01 09:25:14 crc kubenswrapper[4867]: I1201 09:25:14.461960 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.478270 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" event={"ID":"461764b0-73a3-4866-aec1-e687293591e3","Type":"ContainerStarted","Data":"8d08cb0c74129c9853d34601483bac113cb29f1e6e1fe95fbaa7828fe03c88fc"} Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.478708 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" event={"ID":"461764b0-73a3-4866-aec1-e687293591e3","Type":"ContainerStarted","Data":"159d3b00556cc198145914b8dd284776d555012909af7954e16a7356f5a15503"} Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.479105 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.609588 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-nrm56" Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.623562 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4wbsd" Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.637277 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" podStartSLOduration=4.055991288 podStartE2EDuration="1m0.637256147s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.255100182 +0000 UTC m=+980.714486936" lastFinishedPulling="2025-12-01 09:25:15.836365041 +0000 UTC m=+1037.295751795" observedRunningTime="2025-12-01 09:25:16.506055977 +0000 UTC m=+1037.965442741" 
watchObservedRunningTime="2025-12-01 09:25:16.637256147 +0000 UTC m=+1038.096642911" Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.667539 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-9nc4v" Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.730273 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-vktv2" Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.793252 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-p7rms" Dec 01 09:25:16 crc kubenswrapper[4867]: I1201 09:25:16.822394 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zdllr" Dec 01 09:25:16 crc kubenswrapper[4867]: E1201 09:25:16.829996 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" podUID="bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf" Dec 01 09:25:17 crc kubenswrapper[4867]: I1201 09:25:17.007264 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b4j75" Dec 01 09:25:17 crc kubenswrapper[4867]: I1201 09:25:17.155355 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-hlksd" Dec 01 09:25:17 crc kubenswrapper[4867]: I1201 09:25:17.401658 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-77sbx" Dec 01 09:25:17 crc kubenswrapper[4867]: I1201 09:25:17.418401 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-twg2p" Dec 01 09:25:17 crc kubenswrapper[4867]: I1201 09:25:17.445398 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sjmfh" Dec 01 09:25:17 crc kubenswrapper[4867]: I1201 09:25:17.814887 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-swrc5" Dec 01 09:25:18 crc kubenswrapper[4867]: E1201 09:25:18.832441 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" podUID="ffbd9e52-147b-42cd-abaa-ec7d1341b826" Dec 01 09:25:19 crc kubenswrapper[4867]: I1201 09:25:19.830047 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:25:22 crc kubenswrapper[4867]: I1201 09:25:22.485862 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-24whr" Dec 01 09:25:23 crc kubenswrapper[4867]: I1201 09:25:23.124568 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4" Dec 01 09:25:23 crc kubenswrapper[4867]: I1201 09:25:23.540587 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" event={"ID":"9be92a6c-afef-449e-927b-8d0732a2140a","Type":"ContainerStarted","Data":"a267cf90698b33d2210eed39587ab70285fffbb1aa5cc530e4319cf3f7cac21e"} Dec 01 09:25:23 crc kubenswrapper[4867]: I1201 09:25:23.541529 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" Dec 01 09:25:23 crc kubenswrapper[4867]: I1201 09:25:23.582754 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" podStartSLOduration=3.85068302 podStartE2EDuration="1m7.582730536s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.296515223 +0000 UTC m=+980.755901977" lastFinishedPulling="2025-12-01 09:25:23.028562739 +0000 UTC m=+1044.487949493" observedRunningTime="2025-12-01 09:25:23.570543159 +0000 UTC m=+1045.029929943" watchObservedRunningTime="2025-12-01 09:25:23.582730536 +0000 UTC m=+1045.042117290" Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.564170 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" event={"ID":"07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a","Type":"ContainerStarted","Data":"1a75b27d521fdcc6dd67f72fea9a8f413e66cf59d63f7f0745641f31a01d8b87"} Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.564428 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.574282 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" 
event={"ID":"f3176675-0a3a-4fd2-9727-349ec1b88de7","Type":"ContainerStarted","Data":"70c6213e0ae7bb6f9e5ccded3264b725210029238781cd71bf3991350e727ff2"} Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.574366 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" event={"ID":"f3176675-0a3a-4fd2-9727-349ec1b88de7","Type":"ContainerStarted","Data":"c24be97f5631ca62f555f3feafaa4213931537be21111a8127c7fc940fbc9034"} Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.575623 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.581539 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" event={"ID":"e9fd074d-b9bc-4215-bfd7-56df604f101c","Type":"ContainerStarted","Data":"3cd32887c851576b987eb907a2a4c35e58de742efb270ce2227b9708dc96cb02"} Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.581894 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.590575 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" podStartSLOduration=4.59471519 podStartE2EDuration="1m8.590555933s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.063939064 +0000 UTC m=+980.523325818" lastFinishedPulling="2025-12-01 09:25:23.059779807 +0000 UTC m=+1044.519166561" observedRunningTime="2025-12-01 09:25:24.583517935 +0000 UTC m=+1046.042904699" watchObservedRunningTime="2025-12-01 09:25:24.590555933 +0000 UTC m=+1046.049942687" Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.630903 4867 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" podStartSLOduration=4.670904565 podStartE2EDuration="1m8.630879825s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.295071045 +0000 UTC m=+980.754457799" lastFinishedPulling="2025-12-01 09:25:23.255046305 +0000 UTC m=+1044.714433059" observedRunningTime="2025-12-01 09:25:24.624766131 +0000 UTC m=+1046.084152875" watchObservedRunningTime="2025-12-01 09:25:24.630879825 +0000 UTC m=+1046.090266579" Dec 01 09:25:24 crc kubenswrapper[4867]: I1201 09:25:24.671112 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" podStartSLOduration=4.728754247 podStartE2EDuration="1m8.671087024s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.088571245 +0000 UTC m=+980.547957999" lastFinishedPulling="2025-12-01 09:25:23.030904022 +0000 UTC m=+1044.490290776" observedRunningTime="2025-12-01 09:25:24.661306781 +0000 UTC m=+1046.120693535" watchObservedRunningTime="2025-12-01 09:25:24.671087024 +0000 UTC m=+1046.130473778" Dec 01 09:25:25 crc kubenswrapper[4867]: I1201 09:25:25.590267 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" event={"ID":"573accdf-9cb1-4643-af86-744e695a1f9d","Type":"ContainerStarted","Data":"39a0fe7998c8ba7f9fec4c17a97d54d0b01e8706891f3347e2e716f7d2eebe5a"} Dec 01 09:25:25 crc kubenswrapper[4867]: I1201 09:25:25.591914 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" Dec 01 09:25:25 crc kubenswrapper[4867]: I1201 09:25:25.623283 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" podStartSLOduration=3.37068309 podStartE2EDuration="1m8.623262588s" podCreationTimestamp="2025-12-01 09:24:17 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.295192378 +0000 UTC m=+980.754579132" lastFinishedPulling="2025-12-01 09:25:24.547771876 +0000 UTC m=+1046.007158630" observedRunningTime="2025-12-01 09:25:25.622089066 +0000 UTC m=+1047.081475820" watchObservedRunningTime="2025-12-01 09:25:25.623262588 +0000 UTC m=+1047.082649362" Dec 01 09:25:27 crc kubenswrapper[4867]: I1201 09:25:27.876597 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-grrzp" Dec 01 09:25:29 crc kubenswrapper[4867]: I1201 09:25:29.619009 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" event={"ID":"bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf","Type":"ContainerStarted","Data":"f6ec6eac5bb0e77b19482d0157b677bab2a56533f91db009b50d02b8bbfd8744"} Dec 01 09:25:29 crc kubenswrapper[4867]: I1201 09:25:29.657668 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-68x8r" podStartSLOduration=3.492564931 podStartE2EDuration="1m12.6576516s" podCreationTimestamp="2025-12-01 09:24:17 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.41863426 +0000 UTC m=+980.878021014" lastFinishedPulling="2025-12-01 09:25:28.583720929 +0000 UTC m=+1050.043107683" observedRunningTime="2025-12-01 09:25:29.650669482 +0000 UTC m=+1051.110056236" watchObservedRunningTime="2025-12-01 09:25:29.6576516 +0000 UTC m=+1051.117038354" Dec 01 09:25:35 crc kubenswrapper[4867]: I1201 09:25:35.670914 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" 
event={"ID":"ffbd9e52-147b-42cd-abaa-ec7d1341b826","Type":"ContainerStarted","Data":"46878dddcbff44bad1049a27b13f35066dd6500b5e57e60e2f3f2feca8014adf"} Dec 01 09:25:35 crc kubenswrapper[4867]: I1201 09:25:35.671564 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" Dec 01 09:25:35 crc kubenswrapper[4867]: I1201 09:25:35.691284 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" podStartSLOduration=4.33525265 podStartE2EDuration="1m19.691263187s" podCreationTimestamp="2025-12-01 09:24:16 +0000 UTC" firstStartedPulling="2025-12-01 09:24:19.301214369 +0000 UTC m=+980.760601123" lastFinishedPulling="2025-12-01 09:25:34.657224916 +0000 UTC m=+1056.116611660" observedRunningTime="2025-12-01 09:25:35.69064517 +0000 UTC m=+1057.150031934" watchObservedRunningTime="2025-12-01 09:25:35.691263187 +0000 UTC m=+1057.150649941" Dec 01 09:25:37 crc kubenswrapper[4867]: I1201 09:25:37.055676 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-492tf" Dec 01 09:25:37 crc kubenswrapper[4867]: I1201 09:25:37.504181 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mxkvs" Dec 01 09:25:37 crc kubenswrapper[4867]: I1201 09:25:37.747033 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-l7jwc" Dec 01 09:25:37 crc kubenswrapper[4867]: I1201 09:25:37.986893 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j698r" Dec 01 09:25:37 crc kubenswrapper[4867]: I1201 09:25:37.994629 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g2ddn" Dec 01 09:25:47 crc kubenswrapper[4867]: I1201 09:25:47.972724 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-bhgk8" Dec 01 09:25:51 crc kubenswrapper[4867]: I1201 09:25:51.601329 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:25:51 crc kubenswrapper[4867]: I1201 09:25:51.601771 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.086010 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8sg62"] Dec 01 09:26:03 crc kubenswrapper[4867]: E1201 09:26:03.086773 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4415af65-5e2b-472d-b687-54fa137bf02e" containerName="extract-utilities" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.086786 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4415af65-5e2b-472d-b687-54fa137bf02e" containerName="extract-utilities" Dec 01 09:26:03 crc kubenswrapper[4867]: E1201 09:26:03.086856 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4415af65-5e2b-472d-b687-54fa137bf02e" containerName="registry-server" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.086862 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4415af65-5e2b-472d-b687-54fa137bf02e" containerName="registry-server" Dec 01 09:26:03 crc kubenswrapper[4867]: E1201 09:26:03.086873 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4415af65-5e2b-472d-b687-54fa137bf02e" containerName="extract-content" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.086879 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4415af65-5e2b-472d-b687-54fa137bf02e" containerName="extract-content" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.087021 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4415af65-5e2b-472d-b687-54fa137bf02e" containerName="registry-server" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.087717 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.091734 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.093302 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.093572 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.093746 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5xhvz" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.110031 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8sg62"] Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.148216 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f68c337-8402-4802-bd14-1590ae5d0a3e-config\") pod \"dnsmasq-dns-675f4bcbfc-8sg62\" (UID: 
\"5f68c337-8402-4802-bd14-1590ae5d0a3e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.148300 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrswf\" (UniqueName: \"kubernetes.io/projected/5f68c337-8402-4802-bd14-1590ae5d0a3e-kube-api-access-lrswf\") pod \"dnsmasq-dns-675f4bcbfc-8sg62\" (UID: \"5f68c337-8402-4802-bd14-1590ae5d0a3e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.193168 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jm5gj"] Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.194326 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.203900 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.214823 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jm5gj"] Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.249159 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-config\") pod \"dnsmasq-dns-78dd6ddcc-jm5gj\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.249217 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jm5gj\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: 
I1201 09:26:03.249253 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f68c337-8402-4802-bd14-1590ae5d0a3e-config\") pod \"dnsmasq-dns-675f4bcbfc-8sg62\" (UID: \"5f68c337-8402-4802-bd14-1590ae5d0a3e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.249504 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt4q6\" (UniqueName: \"kubernetes.io/projected/06e447f3-796d-4ff8-a960-8bb125768a59-kube-api-access-rt4q6\") pod \"dnsmasq-dns-78dd6ddcc-jm5gj\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.249618 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrswf\" (UniqueName: \"kubernetes.io/projected/5f68c337-8402-4802-bd14-1590ae5d0a3e-kube-api-access-lrswf\") pod \"dnsmasq-dns-675f4bcbfc-8sg62\" (UID: \"5f68c337-8402-4802-bd14-1590ae5d0a3e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.250106 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f68c337-8402-4802-bd14-1590ae5d0a3e-config\") pod \"dnsmasq-dns-675f4bcbfc-8sg62\" (UID: \"5f68c337-8402-4802-bd14-1590ae5d0a3e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.282073 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrswf\" (UniqueName: \"kubernetes.io/projected/5f68c337-8402-4802-bd14-1590ae5d0a3e-kube-api-access-lrswf\") pod \"dnsmasq-dns-675f4bcbfc-8sg62\" (UID: \"5f68c337-8402-4802-bd14-1590ae5d0a3e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.350782 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-config\") pod \"dnsmasq-dns-78dd6ddcc-jm5gj\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.351646 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jm5gj\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.351602 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-config\") pod \"dnsmasq-dns-78dd6ddcc-jm5gj\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.352290 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jm5gj\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.351747 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt4q6\" (UniqueName: \"kubernetes.io/projected/06e447f3-796d-4ff8-a960-8bb125768a59-kube-api-access-rt4q6\") pod \"dnsmasq-dns-78dd6ddcc-jm5gj\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.370668 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt4q6\" (UniqueName: 
\"kubernetes.io/projected/06e447f3-796d-4ff8-a960-8bb125768a59-kube-api-access-rt4q6\") pod \"dnsmasq-dns-78dd6ddcc-jm5gj\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.403722 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.506382 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.773189 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jm5gj"] Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.873139 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8sg62"] Dec 01 09:26:03 crc kubenswrapper[4867]: W1201 09:26:03.873622 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f68c337_8402_4802_bd14_1590ae5d0a3e.slice/crio-43af29006f6d3c8cffdf61efa65cfb45d8cd4e690bfc83dc294195b9e7674836 WatchSource:0}: Error finding container 43af29006f6d3c8cffdf61efa65cfb45d8cd4e690bfc83dc294195b9e7674836: Status 404 returned error can't find the container with id 43af29006f6d3c8cffdf61efa65cfb45d8cd4e690bfc83dc294195b9e7674836 Dec 01 09:26:03 crc kubenswrapper[4867]: I1201 09:26:03.878600 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" event={"ID":"06e447f3-796d-4ff8-a960-8bb125768a59","Type":"ContainerStarted","Data":"2e9bb13043cd0eabfd465fe091443a80a5c0895318e44774ce4436163f2ba0d1"} Dec 01 09:26:04 crc kubenswrapper[4867]: I1201 09:26:04.888224 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" 
event={"ID":"5f68c337-8402-4802-bd14-1590ae5d0a3e","Type":"ContainerStarted","Data":"43af29006f6d3c8cffdf61efa65cfb45d8cd4e690bfc83dc294195b9e7674836"} Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.131199 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8sg62"] Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.159220 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pmmrl"] Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.160481 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.199846 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pmmrl"] Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.200137 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n82jt\" (UniqueName: \"kubernetes.io/projected/d918819a-b715-46bf-95bc-cef73e65ea8d-kube-api-access-n82jt\") pod \"dnsmasq-dns-666b6646f7-pmmrl\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.200183 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-config\") pod \"dnsmasq-dns-666b6646f7-pmmrl\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.200242 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pmmrl\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " 
pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.305782 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pmmrl\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.305899 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n82jt\" (UniqueName: \"kubernetes.io/projected/d918819a-b715-46bf-95bc-cef73e65ea8d-kube-api-access-n82jt\") pod \"dnsmasq-dns-666b6646f7-pmmrl\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.305920 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-config\") pod \"dnsmasq-dns-666b6646f7-pmmrl\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.306738 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-config\") pod \"dnsmasq-dns-666b6646f7-pmmrl\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.307273 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pmmrl\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.338069 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n82jt\" (UniqueName: \"kubernetes.io/projected/d918819a-b715-46bf-95bc-cef73e65ea8d-kube-api-access-n82jt\") pod \"dnsmasq-dns-666b6646f7-pmmrl\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.486185 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.582570 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jm5gj"] Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.617942 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-k5wvx"] Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.619196 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.627139 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-k5wvx"] Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.723707 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-k5wvx\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.724016 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-config\") pod \"dnsmasq-dns-57d769cc4f-k5wvx\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc 
kubenswrapper[4867]: I1201 09:26:06.724115 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-597h5\" (UniqueName: \"kubernetes.io/projected/957496c6-2ac3-42c5-981a-ea77d637bacd-kube-api-access-597h5\") pod \"dnsmasq-dns-57d769cc4f-k5wvx\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.826415 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-config\") pod \"dnsmasq-dns-57d769cc4f-k5wvx\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.826521 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-597h5\" (UniqueName: \"kubernetes.io/projected/957496c6-2ac3-42c5-981a-ea77d637bacd-kube-api-access-597h5\") pod \"dnsmasq-dns-57d769cc4f-k5wvx\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.826562 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-k5wvx\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.827874 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-k5wvx\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.828391 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-config\") pod \"dnsmasq-dns-57d769cc4f-k5wvx\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.875852 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-597h5\" (UniqueName: \"kubernetes.io/projected/957496c6-2ac3-42c5-981a-ea77d637bacd-kube-api-access-597h5\") pod \"dnsmasq-dns-57d769cc4f-k5wvx\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:06 crc kubenswrapper[4867]: I1201 09:26:06.973698 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.231222 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pmmrl"] Dec 01 09:26:07 crc kubenswrapper[4867]: W1201 09:26:07.239482 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd918819a_b715_46bf_95bc_cef73e65ea8d.slice/crio-1e17bd6835465fcae8ad132f95d2f7a0f7d58ccafc47366f0e48ea1a87ba0e69 WatchSource:0}: Error finding container 1e17bd6835465fcae8ad132f95d2f7a0f7d58ccafc47366f0e48ea1a87ba0e69: Status 404 returned error can't find the container with id 1e17bd6835465fcae8ad132f95d2f7a0f7d58ccafc47366f0e48ea1a87ba0e69 Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.383599 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.385604 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.389354 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.389736 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vjhz7" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.389946 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.390103 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.390261 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.390369 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.390420 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.412597 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544580 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544657 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544693 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544715 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2kwf\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-kube-api-access-h2kwf\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544745 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63bff526-5063-4326-8b3c-0c580320be58-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544773 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544793 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/63bff526-5063-4326-8b3c-0c580320be58-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544836 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544893 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544914 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.544939 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-config-data\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.646744 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-confd\") pod \"rabbitmq-server-0\" 
(UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.646798 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.646894 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2kwf\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-kube-api-access-h2kwf\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.646916 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63bff526-5063-4326-8b3c-0c580320be58-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.646946 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.646968 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63bff526-5063-4326-8b3c-0c580320be58-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.647043 
4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.647145 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.647172 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.647267 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-config-data\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.647351 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.649647 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.650228 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.652369 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.652623 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-config-data\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.652736 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.656531 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63bff526-5063-4326-8b3c-0c580320be58-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 
09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.658011 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.659569 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.660830 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.664746 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63bff526-5063-4326-8b3c-0c580320be58-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.704066 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.715422 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2kwf\" (UniqueName: 
\"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-kube-api-access-h2kwf\") pod \"rabbitmq-server-0\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.719375 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.751638 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-k5wvx"] Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.775707 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.783371 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.786133 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.786321 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.786490 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.786622 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.787727 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s6htk" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.790927 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.790995 4867 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.792490 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.954649 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" event={"ID":"d918819a-b715-46bf-95bc-cef73e65ea8d","Type":"ContainerStarted","Data":"1e17bd6835465fcae8ad132f95d2f7a0f7d58ccafc47366f0e48ea1a87ba0e69"} Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.955971 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" event={"ID":"957496c6-2ac3-42c5-981a-ea77d637bacd","Type":"ContainerStarted","Data":"9e60383a59482273a43a29ba6ac0c9f3b916c7205de72ec621aa2146074cfd30"} Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.957877 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.957905 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.957934 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.957962 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.957991 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.958010 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.958029 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.958060 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.958102 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vvxs\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-kube-api-access-7vvxs\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.958136 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:07 crc kubenswrapper[4867]: I1201 09:26:07.958157 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.060379 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vvxs\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-kube-api-access-7vvxs\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.060435 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.060462 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.060483 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.060500 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.060519 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.060538 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: 
I1201 09:26:08.061044 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.061358 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.064303 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.060564 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.064383 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.064411 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.064454 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.065039 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.066257 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.071336 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.078108 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.091095 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vvxs\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-kube-api-access-7vvxs\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.094779 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.097251 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.097778 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.101447 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc 
kubenswrapper[4867]: I1201 09:26:08.130861 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.405840 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.779163 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:26:08 crc kubenswrapper[4867]: W1201 09:26:08.789116 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f260d89_a8a0_4d49_a34a_a36a06ef2eee.slice/crio-5599e1d50b5891175e084c006f1a4c0fe84add0d38a187320464286d0ae49fb0 WatchSource:0}: Error finding container 5599e1d50b5891175e084c006f1a4c0fe84add0d38a187320464286d0ae49fb0: Status 404 returned error can't find the container with id 5599e1d50b5891175e084c006f1a4c0fe84add0d38a187320464286d0ae49fb0 Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.960362 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.961932 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.963921 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-tb6ws" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.964179 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.964619 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.975183 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.978595 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.986950 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7f260d89-a8a0-4d49-a34a-a36a06ef2eee","Type":"ContainerStarted","Data":"5599e1d50b5891175e084c006f1a4c0fe84add0d38a187320464286d0ae49fb0"} Dec 01 09:26:08 crc kubenswrapper[4867]: I1201 09:26:08.997110 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63bff526-5063-4326-8b3c-0c580320be58","Type":"ContainerStarted","Data":"d071e4ff6b3e798f26750823e7e4e403d7551d813defb7d6a22c2dc336d1d4da"} Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.003248 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.081166 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wj4w\" (UniqueName: \"kubernetes.io/projected/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-kube-api-access-5wj4w\") pod \"openstack-galera-0\" (UID: 
\"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.081223 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-config-data-default\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.081361 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.081421 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.081452 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.081473 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " 
pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.081489 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-kolla-config\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.081522 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.183148 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.183200 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.183225 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-kolla-config\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.183617 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.183687 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wj4w\" (UniqueName: \"kubernetes.io/projected/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-kube-api-access-5wj4w\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.183709 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-config-data-default\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.183763 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.183805 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.186037 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.186639 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-config-data-default\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.187215 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.189248 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-kolla-config\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.202070 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.203161 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.203688 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wj4w\" (UniqueName: \"kubernetes.io/projected/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-kube-api-access-5wj4w\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.208576 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a36be7a-7b6d-443d-94c6-4b3bdff15ec8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.213993 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8\") " pod="openstack/openstack-galera-0" Dec 01 09:26:09 crc kubenswrapper[4867]: I1201 09:26:09.292757 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.014099 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.429770 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.431049 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.435441 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.435711 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xvd4z" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.442917 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.443227 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.449291 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.521863 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31106653-bdaa-49c3-b14c-8eb180b0b2c3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.521991 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31106653-bdaa-49c3-b14c-8eb180b0b2c3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.522028 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/31106653-bdaa-49c3-b14c-8eb180b0b2c3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.522317 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31106653-bdaa-49c3-b14c-8eb180b0b2c3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.522357 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31106653-bdaa-49c3-b14c-8eb180b0b2c3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.522463 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31106653-bdaa-49c3-b14c-8eb180b0b2c3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.522513 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqr5j\" (UniqueName: \"kubernetes.io/projected/31106653-bdaa-49c3-b14c-8eb180b0b2c3-kube-api-access-sqr5j\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.522655 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.623859 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31106653-bdaa-49c3-b14c-8eb180b0b2c3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.623903 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31106653-bdaa-49c3-b14c-8eb180b0b2c3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.623950 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31106653-bdaa-49c3-b14c-8eb180b0b2c3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.623971 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31106653-bdaa-49c3-b14c-8eb180b0b2c3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.624019 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/31106653-bdaa-49c3-b14c-8eb180b0b2c3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.624049 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqr5j\" (UniqueName: \"kubernetes.io/projected/31106653-bdaa-49c3-b14c-8eb180b0b2c3-kube-api-access-sqr5j\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.624087 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.624106 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31106653-bdaa-49c3-b14c-8eb180b0b2c3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.625559 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31106653-bdaa-49c3-b14c-8eb180b0b2c3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.625934 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.626174 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31106653-bdaa-49c3-b14c-8eb180b0b2c3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.627132 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31106653-bdaa-49c3-b14c-8eb180b0b2c3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.627593 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31106653-bdaa-49c3-b14c-8eb180b0b2c3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.637589 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31106653-bdaa-49c3-b14c-8eb180b0b2c3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.637664 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31106653-bdaa-49c3-b14c-8eb180b0b2c3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") 
" pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.674075 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.691030 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqr5j\" (UniqueName: \"kubernetes.io/projected/31106653-bdaa-49c3-b14c-8eb180b0b2c3-kube-api-access-sqr5j\") pod \"openstack-cell1-galera-0\" (UID: \"31106653-bdaa-49c3-b14c-8eb180b0b2c3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.726321 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.728037 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.730521 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mwdrl" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.731484 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.731730 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.749267 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.777462 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.826088 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a737446-2c4b-44f1-b660-9e433c5eb2d1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.826395 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a737446-2c4b-44f1-b660-9e433c5eb2d1-config-data\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.826549 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl7rf\" (UniqueName: \"kubernetes.io/projected/2a737446-2c4b-44f1-b660-9e433c5eb2d1-kube-api-access-gl7rf\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.826626 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a737446-2c4b-44f1-b660-9e433c5eb2d1-kolla-config\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.826717 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a737446-2c4b-44f1-b660-9e433c5eb2d1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 
09:26:10.932591 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl7rf\" (UniqueName: \"kubernetes.io/projected/2a737446-2c4b-44f1-b660-9e433c5eb2d1-kube-api-access-gl7rf\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.932680 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a737446-2c4b-44f1-b660-9e433c5eb2d1-kolla-config\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.932731 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a737446-2c4b-44f1-b660-9e433c5eb2d1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.932775 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a737446-2c4b-44f1-b660-9e433c5eb2d1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.943356 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a737446-2c4b-44f1-b660-9e433c5eb2d1-config-data\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.944258 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a737446-2c4b-44f1-b660-9e433c5eb2d1-config-data\") pod 
\"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.953221 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a737446-2c4b-44f1-b660-9e433c5eb2d1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.961917 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a737446-2c4b-44f1-b660-9e433c5eb2d1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.967013 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a737446-2c4b-44f1-b660-9e433c5eb2d1-kolla-config\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:10 crc kubenswrapper[4867]: I1201 09:26:10.994608 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl7rf\" (UniqueName: \"kubernetes.io/projected/2a737446-2c4b-44f1-b660-9e433c5eb2d1-kube-api-access-gl7rf\") pod \"memcached-0\" (UID: \"2a737446-2c4b-44f1-b660-9e433c5eb2d1\") " pod="openstack/memcached-0" Dec 01 09:26:11 crc kubenswrapper[4867]: I1201 09:26:11.073047 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 01 09:26:11 crc kubenswrapper[4867]: I1201 09:26:11.163202 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8","Type":"ContainerStarted","Data":"df7569143598bd8789f6f6551b54727ec3ee8af78641472fae945bc661b633a9"} Dec 01 09:26:11 crc kubenswrapper[4867]: I1201 09:26:11.690406 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 09:26:11 crc kubenswrapper[4867]: W1201 09:26:11.700691 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31106653_bdaa_49c3_b14c_8eb180b0b2c3.slice/crio-f710cd42639407a09745882a784503e3899941710b2c159f7e0c39d8349cac8e WatchSource:0}: Error finding container f710cd42639407a09745882a784503e3899941710b2c159f7e0c39d8349cac8e: Status 404 returned error can't find the container with id f710cd42639407a09745882a784503e3899941710b2c159f7e0c39d8349cac8e Dec 01 09:26:11 crc kubenswrapper[4867]: I1201 09:26:11.840952 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 09:26:12 crc kubenswrapper[4867]: I1201 09:26:12.198931 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31106653-bdaa-49c3-b14c-8eb180b0b2c3","Type":"ContainerStarted","Data":"f710cd42639407a09745882a784503e3899941710b2c159f7e0c39d8349cac8e"} Dec 01 09:26:12 crc kubenswrapper[4867]: I1201 09:26:12.638242 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:26:12 crc kubenswrapper[4867]: I1201 09:26:12.639443 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:26:12 crc kubenswrapper[4867]: I1201 09:26:12.648367 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nkvgm" Dec 01 09:26:12 crc kubenswrapper[4867]: I1201 09:26:12.648637 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:26:12 crc kubenswrapper[4867]: I1201 09:26:12.795796 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxpzc\" (UniqueName: \"kubernetes.io/projected/1847ceb6-9b89-4f58-8bc0-d70d28fe4890-kube-api-access-wxpzc\") pod \"kube-state-metrics-0\" (UID: \"1847ceb6-9b89-4f58-8bc0-d70d28fe4890\") " pod="openstack/kube-state-metrics-0" Dec 01 09:26:12 crc kubenswrapper[4867]: I1201 09:26:12.898639 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxpzc\" (UniqueName: \"kubernetes.io/projected/1847ceb6-9b89-4f58-8bc0-d70d28fe4890-kube-api-access-wxpzc\") pod \"kube-state-metrics-0\" (UID: \"1847ceb6-9b89-4f58-8bc0-d70d28fe4890\") " pod="openstack/kube-state-metrics-0" Dec 01 09:26:12 crc kubenswrapper[4867]: I1201 09:26:12.936700 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxpzc\" (UniqueName: \"kubernetes.io/projected/1847ceb6-9b89-4f58-8bc0-d70d28fe4890-kube-api-access-wxpzc\") pod \"kube-state-metrics-0\" (UID: \"1847ceb6-9b89-4f58-8bc0-d70d28fe4890\") " pod="openstack/kube-state-metrics-0" Dec 01 09:26:12 crc kubenswrapper[4867]: I1201 09:26:12.988718 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.427985 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.429537 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.450276 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.450319 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.450276 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.450479 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.450985 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-prfc7" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.469069 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.569256 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.569713 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.570072 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.570264 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q44nk\" (UniqueName: \"kubernetes.io/projected/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-kube-api-access-q44nk\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.570296 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.570314 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.570480 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.570529 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.671521 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.671578 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.671605 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.671637 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" 
Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.672073 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.672593 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.672619 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.672659 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.672720 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q44nk\" (UniqueName: \"kubernetes.io/projected/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-kube-api-access-q44nk\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.672437 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.674655 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.675807 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.681962 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.691442 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.710601 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q44nk\" (UniqueName: \"kubernetes.io/projected/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-kube-api-access-q44nk\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " 
pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.714089 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.714175 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:15 crc kubenswrapper[4867]: I1201 09:26:15.791430 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.793786 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vmh2x"] Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.795395 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.802955 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-4wqqv" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.803039 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.803703 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.848683 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vmh2x"] Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.886149 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9jsgc"] Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.887910 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.912875 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9jsgc"] Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.918607 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-var-log-ovn\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.918682 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nc72\" (UniqueName: \"kubernetes.io/projected/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-kube-api-access-8nc72\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.918752 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-ovn-controller-tls-certs\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.918887 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-scripts\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.918992 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-var-run\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.919103 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-var-run-ovn\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:17 crc kubenswrapper[4867]: I1201 09:26:17.919133 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-combined-ca-bundle\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020113 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-var-log-ovn\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020168 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzkct\" (UniqueName: \"kubernetes.io/projected/91f709b6-1fa8-40fb-80a0-45ea9510b009-kube-api-access-gzkct\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020195 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc72\" (UniqueName: 
\"kubernetes.io/projected/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-kube-api-access-8nc72\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020230 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-etc-ovs\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020263 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-ovn-controller-tls-certs\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020294 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91f709b6-1fa8-40fb-80a0-45ea9510b009-scripts\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020330 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-scripts\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020355 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-var-run\") pod 
\"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-var-run\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020405 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-var-log\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020438 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-var-lib\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020458 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-var-run-ovn\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.020475 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-combined-ca-bundle\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 
01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.021325 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-var-run\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.022007 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-var-run-ovn\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.023220 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-scripts\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.024590 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-var-log-ovn\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.025938 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-ovn-controller-tls-certs\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.041270 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-combined-ca-bundle\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.050661 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nc72\" (UniqueName: \"kubernetes.io/projected/aa810b5f-4cad-40cc-9feb-6afc38b56ab1-kube-api-access-8nc72\") pod \"ovn-controller-vmh2x\" (UID: \"aa810b5f-4cad-40cc-9feb-6afc38b56ab1\") " pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.122361 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzkct\" (UniqueName: \"kubernetes.io/projected/91f709b6-1fa8-40fb-80a0-45ea9510b009-kube-api-access-gzkct\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.122439 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-etc-ovs\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.122482 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91f709b6-1fa8-40fb-80a0-45ea9510b009-scripts\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.122518 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-var-run\") pod \"ovn-controller-ovs-9jsgc\" (UID: 
\"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.122547 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-var-log\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.122577 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-var-lib\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.123014 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-var-lib\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.123414 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-var-run\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.123586 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-var-log\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.123738 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/91f709b6-1fa8-40fb-80a0-45ea9510b009-etc-ovs\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.131593 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91f709b6-1fa8-40fb-80a0-45ea9510b009-scripts\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.159930 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzkct\" (UniqueName: \"kubernetes.io/projected/91f709b6-1fa8-40fb-80a0-45ea9510b009-kube-api-access-gzkct\") pod \"ovn-controller-ovs-9jsgc\" (UID: \"91f709b6-1fa8-40fb-80a0-45ea9510b009\") " pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.163225 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:18 crc kubenswrapper[4867]: I1201 09:26:18.207857 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.820929 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.822394 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.825793 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.826006 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.826236 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.826340 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-t8m7w" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.852159 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.997364 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1081877-3550-4ad4-9a89-a5cddfc4ba31-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.997443 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.997483 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1081877-3550-4ad4-9a89-a5cddfc4ba31-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.997543 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1081877-3550-4ad4-9a89-a5cddfc4ba31-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.997569 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1081877-3550-4ad4-9a89-a5cddfc4ba31-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.997599 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1081877-3550-4ad4-9a89-a5cddfc4ba31-config\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.997621 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbpkl\" (UniqueName: \"kubernetes.io/projected/b1081877-3550-4ad4-9a89-a5cddfc4ba31-kube-api-access-tbpkl\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:19 crc kubenswrapper[4867]: I1201 09:26:19.997646 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1081877-3550-4ad4-9a89-a5cddfc4ba31-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 
09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.099384 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1081877-3550-4ad4-9a89-a5cddfc4ba31-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.099450 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.099470 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1081877-3550-4ad4-9a89-a5cddfc4ba31-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.099494 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1081877-3550-4ad4-9a89-a5cddfc4ba31-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.099515 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1081877-3550-4ad4-9a89-a5cddfc4ba31-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.099537 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b1081877-3550-4ad4-9a89-a5cddfc4ba31-config\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.099553 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbpkl\" (UniqueName: \"kubernetes.io/projected/b1081877-3550-4ad4-9a89-a5cddfc4ba31-kube-api-access-tbpkl\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.099568 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1081877-3550-4ad4-9a89-a5cddfc4ba31-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.100852 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1081877-3550-4ad4-9a89-a5cddfc4ba31-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.102129 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1081877-3550-4ad4-9a89-a5cddfc4ba31-config\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.102711 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1081877-3550-4ad4-9a89-a5cddfc4ba31-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc 
kubenswrapper[4867]: I1201 09:26:20.102743 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.106270 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1081877-3550-4ad4-9a89-a5cddfc4ba31-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.106274 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1081877-3550-4ad4-9a89-a5cddfc4ba31-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.106576 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1081877-3550-4ad4-9a89-a5cddfc4ba31-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.124946 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbpkl\" (UniqueName: \"kubernetes.io/projected/b1081877-3550-4ad4-9a89-a5cddfc4ba31-kube-api-access-tbpkl\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.138390 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b1081877-3550-4ad4-9a89-a5cddfc4ba31\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:20 crc kubenswrapper[4867]: I1201 09:26:20.155985 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:21 crc kubenswrapper[4867]: I1201 09:26:21.316255 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2a737446-2c4b-44f1-b660-9e433c5eb2d1","Type":"ContainerStarted","Data":"4c212d8406f9182d3b184465707f540c8761df12baaecb3f3c158d377baf4a26"} Dec 01 09:26:21 crc kubenswrapper[4867]: I1201 09:26:21.601759 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:26:21 crc kubenswrapper[4867]: I1201 09:26:21.601862 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:26:31 crc kubenswrapper[4867]: E1201 09:26:31.379114 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 01 09:26:31 crc kubenswrapper[4867]: E1201 09:26:31.379786 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && 
chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7vvxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions
:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(7f260d89-a8a0-4d49-a34a-a36a06ef2eee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:26:31 crc kubenswrapper[4867]: E1201 09:26:31.380338 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 01 09:26:31 crc kubenswrapper[4867]: E1201 09:26:31.380526 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2kwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(63bff526-5063-4326-8b3c-0c580320be58): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:26:31 crc 
kubenswrapper[4867]: E1201 09:26:31.381603 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" Dec 01 09:26:31 crc kubenswrapper[4867]: E1201 09:26:31.381687 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="63bff526-5063-4326-8b3c-0c580320be58" Dec 01 09:26:32 crc kubenswrapper[4867]: E1201 09:26:32.400771 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" Dec 01 09:26:32 crc kubenswrapper[4867]: E1201 09:26:32.402624 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="63bff526-5063-4326-8b3c-0c580320be58" Dec 01 09:26:33 crc kubenswrapper[4867]: E1201 09:26:33.638789 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 01 09:26:33 crc kubenswrapper[4867]: E1201 09:26:33.639193 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5wj4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(7a36be7a-7b6d-443d-94c6-4b3bdff15ec8): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:26:33 crc kubenswrapper[4867]: E1201 09:26:33.640585 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="7a36be7a-7b6d-443d-94c6-4b3bdff15ec8" Dec 01 09:26:34 crc kubenswrapper[4867]: E1201 09:26:34.415964 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="7a36be7a-7b6d-443d-94c6-4b3bdff15ec8" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.854211 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.854936 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-597h5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-k5wvx_openstack(957496c6-2ac3-42c5-981a-ea77d637bacd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.856487 4867 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" podUID="957496c6-2ac3-42c5-981a-ea77d637bacd" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.905151 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.905287 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt4q6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-jm5gj_openstack(06e447f3-796d-4ff8-a960-8bb125768a59): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.909933 4867 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.910324 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n82jt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesy
stem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-pmmrl_openstack(d918819a-b715-46bf-95bc-cef73e65ea8d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.911053 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" podUID="06e447f3-796d-4ff8-a960-8bb125768a59" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.911598 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" podUID="d918819a-b715-46bf-95bc-cef73e65ea8d" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.981658 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.981843 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 
--log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lrswf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-8sg62_openstack(5f68c337-8402-4802-bd14-1590ae5d0a3e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:26:39 crc kubenswrapper[4867]: E1201 09:26:39.983789 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" podUID="5f68c337-8402-4802-bd14-1590ae5d0a3e" Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.430862 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:26:40 crc kubenswrapper[4867]: W1201 09:26:40.433718 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1847ceb6_9b89_4f58_8bc0_d70d28fe4890.slice/crio-9f23bc4842e0d1072afb42ce22297ba71e3977a87dca7bc359ad5327296b443b WatchSource:0}: Error finding container 9f23bc4842e0d1072afb42ce22297ba71e3977a87dca7bc359ad5327296b443b: Status 404 returned error can't find the container with id 9f23bc4842e0d1072afb42ce22297ba71e3977a87dca7bc359ad5327296b443b Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.463305 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1847ceb6-9b89-4f58-8bc0-d70d28fe4890","Type":"ContainerStarted","Data":"9f23bc4842e0d1072afb42ce22297ba71e3977a87dca7bc359ad5327296b443b"} Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.464901 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31106653-bdaa-49c3-b14c-8eb180b0b2c3","Type":"ContainerStarted","Data":"ff2daa296402ee7069e468953ab2d4b70d226c94b4affecd173199f365b6e1e7"} Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.467305 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2a737446-2c4b-44f1-b660-9e433c5eb2d1","Type":"ContainerStarted","Data":"bbc347cd842752d7bf75c80a42e851ddabb77564c2d76a25e9755538a8a65ac7"} Dec 01 09:26:40 crc kubenswrapper[4867]: E1201 09:26:40.468533 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" podUID="957496c6-2ac3-42c5-981a-ea77d637bacd" Dec 01 09:26:40 crc kubenswrapper[4867]: E1201 09:26:40.469364 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" podUID="d918819a-b715-46bf-95bc-cef73e65ea8d" Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.520250 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.619120773 podStartE2EDuration="30.520234393s" podCreationTimestamp="2025-12-01 09:26:10 +0000 UTC" firstStartedPulling="2025-12-01 09:26:21.004248077 +0000 UTC m=+1102.463634831" lastFinishedPulling="2025-12-01 09:26:39.905361697 +0000 UTC m=+1121.364748451" observedRunningTime="2025-12-01 09:26:40.516910162 +0000 UTC m=+1121.976296916" watchObservedRunningTime="2025-12-01 09:26:40.520234393 +0000 UTC m=+1121.979621147" Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.530423 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vmh2x"] Dec 01 09:26:40 crc kubenswrapper[4867]: W1201 09:26:40.531081 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa810b5f_4cad_40cc_9feb_6afc38b56ab1.slice/crio-27d1cad1fc36721b4879bd1dc1f0f2fb1a5e6ee7d3c938cdbe74349c1a6dd1b2 WatchSource:0}: Error finding container 27d1cad1fc36721b4879bd1dc1f0f2fb1a5e6ee7d3c938cdbe74349c1a6dd1b2: Status 404 returned error can't find the container with id 27d1cad1fc36721b4879bd1dc1f0f2fb1a5e6ee7d3c938cdbe74349c1a6dd1b2 Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.718089 4867 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.788203 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrswf\" (UniqueName: \"kubernetes.io/projected/5f68c337-8402-4802-bd14-1590ae5d0a3e-kube-api-access-lrswf\") pod \"5f68c337-8402-4802-bd14-1590ae5d0a3e\" (UID: \"5f68c337-8402-4802-bd14-1590ae5d0a3e\") " Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.788358 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f68c337-8402-4802-bd14-1590ae5d0a3e-config\") pod \"5f68c337-8402-4802-bd14-1590ae5d0a3e\" (UID: \"5f68c337-8402-4802-bd14-1590ae5d0a3e\") " Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.788868 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f68c337-8402-4802-bd14-1590ae5d0a3e-config" (OuterVolumeSpecName: "config") pod "5f68c337-8402-4802-bd14-1590ae5d0a3e" (UID: "5f68c337-8402-4802-bd14-1590ae5d0a3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.799102 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f68c337-8402-4802-bd14-1590ae5d0a3e-kube-api-access-lrswf" (OuterVolumeSpecName: "kube-api-access-lrswf") pod "5f68c337-8402-4802-bd14-1590ae5d0a3e" (UID: "5f68c337-8402-4802-bd14-1590ae5d0a3e"). InnerVolumeSpecName "kube-api-access-lrswf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.891235 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrswf\" (UniqueName: \"kubernetes.io/projected/5f68c337-8402-4802-bd14-1590ae5d0a3e-kube-api-access-lrswf\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.891273 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f68c337-8402-4802-bd14-1590ae5d0a3e-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:40 crc kubenswrapper[4867]: I1201 09:26:40.948267 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.021177 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.074228 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.093906 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt4q6\" (UniqueName: \"kubernetes.io/projected/06e447f3-796d-4ff8-a960-8bb125768a59-kube-api-access-rt4q6\") pod \"06e447f3-796d-4ff8-a960-8bb125768a59\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.093969 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-config\") pod \"06e447f3-796d-4ff8-a960-8bb125768a59\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.094001 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-dns-svc\") pod \"06e447f3-796d-4ff8-a960-8bb125768a59\" (UID: \"06e447f3-796d-4ff8-a960-8bb125768a59\") " Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.094407 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06e447f3-796d-4ff8-a960-8bb125768a59" (UID: "06e447f3-796d-4ff8-a960-8bb125768a59"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.094519 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-config" (OuterVolumeSpecName: "config") pod "06e447f3-796d-4ff8-a960-8bb125768a59" (UID: "06e447f3-796d-4ff8-a960-8bb125768a59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.098051 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e447f3-796d-4ff8-a960-8bb125768a59-kube-api-access-rt4q6" (OuterVolumeSpecName: "kube-api-access-rt4q6") pod "06e447f3-796d-4ff8-a960-8bb125768a59" (UID: "06e447f3-796d-4ff8-a960-8bb125768a59"). InnerVolumeSpecName "kube-api-access-rt4q6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.198546 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt4q6\" (UniqueName: \"kubernetes.io/projected/06e447f3-796d-4ff8-a960-8bb125768a59-kube-api-access-rt4q6\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.198593 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.198618 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06e447f3-796d-4ff8-a960-8bb125768a59-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.464601 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9jsgc"] Dec 01 09:26:41 crc kubenswrapper[4867]: W1201 09:26:41.470985 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f709b6_1fa8_40fb_80a0_45ea9510b009.slice/crio-6b3059374e013f7628a4e82faf85ef0fc351e8801e5cb7bb6434f05c3e5b04a5 WatchSource:0}: Error finding container 6b3059374e013f7628a4e82faf85ef0fc351e8801e5cb7bb6434f05c3e5b04a5: Status 404 returned error can't find the container with id 6b3059374e013f7628a4e82faf85ef0fc351e8801e5cb7bb6434f05c3e5b04a5 Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.479374 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" event={"ID":"06e447f3-796d-4ff8-a960-8bb125768a59","Type":"ContainerDied","Data":"2e9bb13043cd0eabfd465fe091443a80a5c0895318e44774ce4436163f2ba0d1"} Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.479403 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jm5gj" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.481404 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" event={"ID":"5f68c337-8402-4802-bd14-1590ae5d0a3e","Type":"ContainerDied","Data":"43af29006f6d3c8cffdf61efa65cfb45d8cd4e690bfc83dc294195b9e7674836"} Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.481678 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8sg62" Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.484961 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b1081877-3550-4ad4-9a89-a5cddfc4ba31","Type":"ContainerStarted","Data":"777c596a74fc6b3257d4bda04c49e420916fa387ba94afe22845ff0f585a3ad4"} Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.490150 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vmh2x" event={"ID":"aa810b5f-4cad-40cc-9feb-6afc38b56ab1","Type":"ContainerStarted","Data":"27d1cad1fc36721b4879bd1dc1f0f2fb1a5e6ee7d3c938cdbe74349c1a6dd1b2"} Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.532870 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8sg62"] Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.539857 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8sg62"] Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.569001 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jm5gj"] Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.578464 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jm5gj"] Dec 01 09:26:41 crc kubenswrapper[4867]: I1201 09:26:41.696282 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 
09:26:42 crc kubenswrapper[4867]: I1201 09:26:42.500926 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9jsgc" event={"ID":"91f709b6-1fa8-40fb-80a0-45ea9510b009","Type":"ContainerStarted","Data":"6b3059374e013f7628a4e82faf85ef0fc351e8801e5cb7bb6434f05c3e5b04a5"} Dec 01 09:26:42 crc kubenswrapper[4867]: I1201 09:26:42.502685 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df","Type":"ContainerStarted","Data":"6e17bda509936adf367e8c8cf21c58120c0d0ad929c4d1901aa887445d9c858b"} Dec 01 09:26:42 crc kubenswrapper[4867]: I1201 09:26:42.836087 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e447f3-796d-4ff8-a960-8bb125768a59" path="/var/lib/kubelet/pods/06e447f3-796d-4ff8-a960-8bb125768a59/volumes" Dec 01 09:26:42 crc kubenswrapper[4867]: I1201 09:26:42.836585 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f68c337-8402-4802-bd14-1590ae5d0a3e" path="/var/lib/kubelet/pods/5f68c337-8402-4802-bd14-1590ae5d0a3e/volumes" Dec 01 09:26:45 crc kubenswrapper[4867]: I1201 09:26:45.525585 4867 generic.go:334] "Generic (PLEG): container finished" podID="31106653-bdaa-49c3-b14c-8eb180b0b2c3" containerID="ff2daa296402ee7069e468953ab2d4b70d226c94b4affecd173199f365b6e1e7" exitCode=0 Dec 01 09:26:45 crc kubenswrapper[4867]: I1201 09:26:45.525623 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31106653-bdaa-49c3-b14c-8eb180b0b2c3","Type":"ContainerDied","Data":"ff2daa296402ee7069e468953ab2d4b70d226c94b4affecd173199f365b6e1e7"} Dec 01 09:26:46 crc kubenswrapper[4867]: I1201 09:26:46.074786 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.549599 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8","Type":"ContainerStarted","Data":"768ebae0302c574701bea670cd34f62f6b71dcf20233fc774d68da808a48647e"} Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.553183 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df","Type":"ContainerStarted","Data":"2a2545f9510ee6de332202759b1feb1e268b6cdc73ac2dfeeff30d1d29637238"} Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.555858 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31106653-bdaa-49c3-b14c-8eb180b0b2c3","Type":"ContainerStarted","Data":"cde67702757a02838f56462cf468b774dbc9aa3482e80cfd5fc4816b9245fab1"} Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.557704 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b1081877-3550-4ad4-9a89-a5cddfc4ba31","Type":"ContainerStarted","Data":"97a5119fd4f1d815abf17174a02237270b4f1da527e4142c6d184d93e79b0ad5"} Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.559168 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vmh2x" event={"ID":"aa810b5f-4cad-40cc-9feb-6afc38b56ab1","Type":"ContainerStarted","Data":"d9039888121078585011676b87f20e4603e216066697ab11b539e9146cfb022d"} Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.559375 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vmh2x" Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.560420 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1847ceb6-9b89-4f58-8bc0-d70d28fe4890","Type":"ContainerStarted","Data":"4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16"} Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.560553 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/kube-state-metrics-0" Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.561684 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9jsgc" event={"ID":"91f709b6-1fa8-40fb-80a0-45ea9510b009","Type":"ContainerStarted","Data":"f2cf49b4f9247fa24fa15dc13c041334f452a34cca3731b40e9d3a019c4dfbf8"} Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.615744 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=11.521832832 podStartE2EDuration="39.615718722s" podCreationTimestamp="2025-12-01 09:26:09 +0000 UTC" firstStartedPulling="2025-12-01 09:26:11.705125856 +0000 UTC m=+1093.164512610" lastFinishedPulling="2025-12-01 09:26:39.799011746 +0000 UTC m=+1121.258398500" observedRunningTime="2025-12-01 09:26:48.608224347 +0000 UTC m=+1130.067611101" watchObservedRunningTime="2025-12-01 09:26:48.615718722 +0000 UTC m=+1130.075105476" Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.629808 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vmh2x" podStartSLOduration=24.448681741 podStartE2EDuration="31.629788048s" podCreationTimestamp="2025-12-01 09:26:17 +0000 UTC" firstStartedPulling="2025-12-01 09:26:40.536574882 +0000 UTC m=+1121.995961636" lastFinishedPulling="2025-12-01 09:26:47.717681199 +0000 UTC m=+1129.177067943" observedRunningTime="2025-12-01 09:26:48.625421909 +0000 UTC m=+1130.084808663" watchObservedRunningTime="2025-12-01 09:26:48.629788048 +0000 UTC m=+1130.089174812" Dec 01 09:26:48 crc kubenswrapper[4867]: I1201 09:26:48.651532 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=29.448794394 podStartE2EDuration="36.651510345s" podCreationTimestamp="2025-12-01 09:26:12 +0000 UTC" firstStartedPulling="2025-12-01 09:26:40.436068351 +0000 UTC m=+1121.895455115" lastFinishedPulling="2025-12-01 
09:26:47.638784312 +0000 UTC m=+1129.098171066" observedRunningTime="2025-12-01 09:26:48.644270026 +0000 UTC m=+1130.103656800" watchObservedRunningTime="2025-12-01 09:26:48.651510345 +0000 UTC m=+1130.110897099" Dec 01 09:26:49 crc kubenswrapper[4867]: I1201 09:26:49.574640 4867 generic.go:334] "Generic (PLEG): container finished" podID="91f709b6-1fa8-40fb-80a0-45ea9510b009" containerID="f2cf49b4f9247fa24fa15dc13c041334f452a34cca3731b40e9d3a019c4dfbf8" exitCode=0 Dec 01 09:26:49 crc kubenswrapper[4867]: I1201 09:26:49.574749 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9jsgc" event={"ID":"91f709b6-1fa8-40fb-80a0-45ea9510b009","Type":"ContainerDied","Data":"f2cf49b4f9247fa24fa15dc13c041334f452a34cca3731b40e9d3a019c4dfbf8"} Dec 01 09:26:49 crc kubenswrapper[4867]: I1201 09:26:49.586943 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7f260d89-a8a0-4d49-a34a-a36a06ef2eee","Type":"ContainerStarted","Data":"b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc"} Dec 01 09:26:49 crc kubenswrapper[4867]: I1201 09:26:49.589947 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63bff526-5063-4326-8b3c-0c580320be58","Type":"ContainerStarted","Data":"bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f"} Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.043104 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-pfvw2"] Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.044708 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.046545 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.059663 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pfvw2"] Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.176194 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feff7c40-c771-4824-b3f0-75c4d527044a-combined-ca-bundle\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.176431 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feff7c40-c771-4824-b3f0-75c4d527044a-config\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.176494 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feff7c40-c771-4824-b3f0-75c4d527044a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.176530 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/feff7c40-c771-4824-b3f0-75c4d527044a-ovs-rundir\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " 
pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.176571 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/feff7c40-c771-4824-b3f0-75c4d527044a-ovn-rundir\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.176603 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98g84\" (UniqueName: \"kubernetes.io/projected/feff7c40-c771-4824-b3f0-75c4d527044a-kube-api-access-98g84\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.228285 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-k5wvx"] Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.262593 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vkxgx"] Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.263829 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.266873 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.277859 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feff7c40-c771-4824-b3f0-75c4d527044a-combined-ca-bundle\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.277913 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feff7c40-c771-4824-b3f0-75c4d527044a-config\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.277935 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feff7c40-c771-4824-b3f0-75c4d527044a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.277954 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/feff7c40-c771-4824-b3f0-75c4d527044a-ovn-rundir\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.277968 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98g84\" (UniqueName: 
\"kubernetes.io/projected/feff7c40-c771-4824-b3f0-75c4d527044a-kube-api-access-98g84\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.277982 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/feff7c40-c771-4824-b3f0-75c4d527044a-ovs-rundir\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.279235 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vkxgx"] Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.279244 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/feff7c40-c771-4824-b3f0-75c4d527044a-ovs-rundir\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.279252 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/feff7c40-c771-4824-b3f0-75c4d527044a-ovn-rundir\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.279799 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feff7c40-c771-4824-b3f0-75c4d527044a-config\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.291977 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feff7c40-c771-4824-b3f0-75c4d527044a-combined-ca-bundle\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.301464 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feff7c40-c771-4824-b3f0-75c4d527044a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.337268 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98g84\" (UniqueName: \"kubernetes.io/projected/feff7c40-c771-4824-b3f0-75c4d527044a-kube-api-access-98g84\") pod \"ovn-controller-metrics-pfvw2\" (UID: \"feff7c40-c771-4824-b3f0-75c4d527044a\") " pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.361253 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-pfvw2" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.381899 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.385371 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.385717 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rg5t\" (UniqueName: \"kubernetes.io/projected/5b3976a5-814d-4585-b1e3-a22a6933a44d-kube-api-access-9rg5t\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.385941 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-config\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.488609 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rg5t\" (UniqueName: \"kubernetes.io/projected/5b3976a5-814d-4585-b1e3-a22a6933a44d-kube-api-access-9rg5t\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: 
\"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.488784 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-config\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.488963 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.489128 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.490940 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-config\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.491804 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: 
I1201 09:26:50.491971 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.518080 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rg5t\" (UniqueName: \"kubernetes.io/projected/5b3976a5-814d-4585-b1e3-a22a6933a44d-kube-api-access-9rg5t\") pod \"dnsmasq-dns-7f896c8c65-vkxgx\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.555671 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pmmrl"] Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.586248 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.656340 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9jsgc" event={"ID":"91f709b6-1fa8-40fb-80a0-45ea9510b009","Type":"ContainerStarted","Data":"354c370894d846dee3f4efe8b23e01e1b3bb26779844c8630c23e38478a20ec4"} Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.672883 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jmxx"] Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.674589 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.682887 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.693483 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jmxx"] Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.778021 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.778291 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.803548 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb2tk\" (UniqueName: \"kubernetes.io/projected/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-kube-api-access-nb2tk\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.822425 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-config\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.825859 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: 
I1201 09:26:50.825970 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.826000 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.927455 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.927501 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.927559 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb2tk\" (UniqueName: \"kubernetes.io/projected/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-kube-api-access-nb2tk\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: 
I1201 09:26:50.927620 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-config\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.927679 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.928618 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.928735 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.929444 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.929573 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-config\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:50 crc kubenswrapper[4867]: I1201 09:26:50.948785 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb2tk\" (UniqueName: \"kubernetes.io/projected/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-kube-api-access-nb2tk\") pod \"dnsmasq-dns-86db49b7ff-4jmxx\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:51 crc kubenswrapper[4867]: I1201 09:26:51.005740 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:26:51 crc kubenswrapper[4867]: I1201 09:26:51.601239 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:26:51 crc kubenswrapper[4867]: I1201 09:26:51.601310 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:26:51 crc kubenswrapper[4867]: I1201 09:26:51.601434 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:26:51 crc kubenswrapper[4867]: I1201 09:26:51.602311 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3efb00c27c0eaaf97b5cf3c44be1e5a5598923c3a199804003a6c5848c9f9cea"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:26:51 crc kubenswrapper[4867]: I1201 09:26:51.602356 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://3efb00c27c0eaaf97b5cf3c44be1e5a5598923c3a199804003a6c5848c9f9cea" gracePeriod=600 Dec 01 09:26:52 crc kubenswrapper[4867]: I1201 09:26:52.672649 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="3efb00c27c0eaaf97b5cf3c44be1e5a5598923c3a199804003a6c5848c9f9cea" exitCode=0 Dec 01 09:26:52 crc kubenswrapper[4867]: I1201 09:26:52.672689 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"3efb00c27c0eaaf97b5cf3c44be1e5a5598923c3a199804003a6c5848c9f9cea"} Dec 01 09:26:52 crc kubenswrapper[4867]: I1201 09:26:52.672745 4867 scope.go:117] "RemoveContainer" containerID="793e6afb196d113ee707de55107187443477607da81a9336611cc7c60ae9f91b" Dec 01 09:26:52 crc kubenswrapper[4867]: I1201 09:26:52.921132 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vkxgx"] Dec 01 09:26:52 crc kubenswrapper[4867]: I1201 09:26:52.992590 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-jc265"] Dec 01 09:26:52 crc kubenswrapper[4867]: I1201 09:26:52.993929 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.021962 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.067742 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jc265"] Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.161013 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvpzn\" (UniqueName: \"kubernetes.io/projected/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-kube-api-access-nvpzn\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.161100 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-dns-svc\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.161123 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.161145 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " 
pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.161177 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-config\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.262491 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvpzn\" (UniqueName: \"kubernetes.io/projected/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-kube-api-access-nvpzn\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.262876 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-dns-svc\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.262903 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.263753 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-dns-svc\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc 
kubenswrapper[4867]: I1201 09:26:53.263844 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.263867 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.263879 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-config\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.264396 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.264945 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-config\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.304785 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-nvpzn\" (UniqueName: \"kubernetes.io/projected/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-kube-api-access-nvpzn\") pod \"dnsmasq-dns-698758b865-jc265\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:53 crc kubenswrapper[4867]: I1201 09:26:53.314186 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.052546 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.058184 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.061681 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lxwgv" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.061958 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.062113 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.062318 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.086179 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.178234 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 
09:26:54.178643 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-cache\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.178673 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.178714 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsh85\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-kube-api-access-tsh85\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.178856 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-lock\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.280493 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.280585 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-cache\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.280619 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.280664 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsh85\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-kube-api-access-tsh85\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.280691 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-lock\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.280901 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: E1201 09:26:54.281301 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 09:26:54 crc kubenswrapper[4867]: E1201 09:26:54.281354 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap 
"swift-ring-files" not found Dec 01 09:26:54 crc kubenswrapper[4867]: E1201 09:26:54.281416 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift podName:e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3 nodeName:}" failed. No retries permitted until 2025-12-01 09:26:54.781395481 +0000 UTC m=+1136.240782235 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift") pod "swift-storage-0" (UID: "e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3") : configmap "swift-ring-files" not found Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.281438 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-lock\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.303963 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsh85\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-kube-api-access-tsh85\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.304121 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.446989 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-cache\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " 
pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.568210 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-n24dx"] Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.569363 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.572164 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.572222 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.572598 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.599927 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n24dx"] Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.686171 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-combined-ca-bundle\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.686276 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-dispersionconf\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.686353 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-khqsr\" (UniqueName: \"kubernetes.io/projected/14b301a3-7288-471a-8ca4-cb7f4dca4b96-kube-api-access-khqsr\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.686387 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14b301a3-7288-471a-8ca4-cb7f4dca4b96-etc-swift\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.686412 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-ring-data-devices\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.686457 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-swiftconf\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.686578 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-scripts\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.687893 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="7a36be7a-7b6d-443d-94c6-4b3bdff15ec8" containerID="768ebae0302c574701bea670cd34f62f6b71dcf20233fc774d68da808a48647e" exitCode=0 Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.687927 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8","Type":"ContainerDied","Data":"768ebae0302c574701bea670cd34f62f6b71dcf20233fc774d68da808a48647e"} Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.788500 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-scripts\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.788590 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-combined-ca-bundle\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.788617 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.788678 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-dispersionconf\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.788730 
4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khqsr\" (UniqueName: \"kubernetes.io/projected/14b301a3-7288-471a-8ca4-cb7f4dca4b96-kube-api-access-khqsr\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.788759 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14b301a3-7288-471a-8ca4-cb7f4dca4b96-etc-swift\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.788782 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-ring-data-devices\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.788845 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-swiftconf\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: E1201 09:26:54.789450 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 09:26:54 crc kubenswrapper[4867]: E1201 09:26:54.789478 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 09:26:54 crc kubenswrapper[4867]: E1201 09:26:54.789541 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift podName:e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3 nodeName:}" failed. No retries permitted until 2025-12-01 09:26:55.789514825 +0000 UTC m=+1137.248901609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift") pod "swift-storage-0" (UID: "e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3") : configmap "swift-ring-files" not found Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.791074 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-scripts\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.792555 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14b301a3-7288-471a-8ca4-cb7f4dca4b96-etc-swift\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.792666 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-swiftconf\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.793029 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-ring-data-devices\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 
09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.794138 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-combined-ca-bundle\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.794215 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-dispersionconf\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.808096 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khqsr\" (UniqueName: \"kubernetes.io/projected/14b301a3-7288-471a-8ca4-cb7f4dca4b96-kube-api-access-khqsr\") pod \"swift-ring-rebalance-n24dx\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:54 crc kubenswrapper[4867]: I1201 09:26:54.901283 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:26:55 crc kubenswrapper[4867]: I1201 09:26:55.809183 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:55 crc kubenswrapper[4867]: E1201 09:26:55.809341 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 09:26:55 crc kubenswrapper[4867]: E1201 09:26:55.809358 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 09:26:55 crc kubenswrapper[4867]: E1201 09:26:55.809409 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift podName:e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3 nodeName:}" failed. No retries permitted until 2025-12-01 09:26:57.809393724 +0000 UTC m=+1139.268780478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift") pod "swift-storage-0" (UID: "e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3") : configmap "swift-ring-files" not found Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.032320 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.080344 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.085863 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.215303 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-config\") pod \"957496c6-2ac3-42c5-981a-ea77d637bacd\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.215701 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n82jt\" (UniqueName: \"kubernetes.io/projected/d918819a-b715-46bf-95bc-cef73e65ea8d-kube-api-access-n82jt\") pod \"d918819a-b715-46bf-95bc-cef73e65ea8d\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.215745 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-config" (OuterVolumeSpecName: "config") pod "957496c6-2ac3-42c5-981a-ea77d637bacd" (UID: "957496c6-2ac3-42c5-981a-ea77d637bacd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.215827 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-config\") pod \"d918819a-b715-46bf-95bc-cef73e65ea8d\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.215875 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-597h5\" (UniqueName: \"kubernetes.io/projected/957496c6-2ac3-42c5-981a-ea77d637bacd-kube-api-access-597h5\") pod \"957496c6-2ac3-42c5-981a-ea77d637bacd\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.215903 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-dns-svc\") pod \"957496c6-2ac3-42c5-981a-ea77d637bacd\" (UID: \"957496c6-2ac3-42c5-981a-ea77d637bacd\") " Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.215938 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-dns-svc\") pod \"d918819a-b715-46bf-95bc-cef73e65ea8d\" (UID: \"d918819a-b715-46bf-95bc-cef73e65ea8d\") " Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.216150 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-config" (OuterVolumeSpecName: "config") pod "d918819a-b715-46bf-95bc-cef73e65ea8d" (UID: "d918819a-b715-46bf-95bc-cef73e65ea8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.216343 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.216361 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.216355 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "957496c6-2ac3-42c5-981a-ea77d637bacd" (UID: "957496c6-2ac3-42c5-981a-ea77d637bacd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.216434 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d918819a-b715-46bf-95bc-cef73e65ea8d" (UID: "d918819a-b715-46bf-95bc-cef73e65ea8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.219784 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957496c6-2ac3-42c5-981a-ea77d637bacd-kube-api-access-597h5" (OuterVolumeSpecName: "kube-api-access-597h5") pod "957496c6-2ac3-42c5-981a-ea77d637bacd" (UID: "957496c6-2ac3-42c5-981a-ea77d637bacd"). InnerVolumeSpecName "kube-api-access-597h5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.220433 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d918819a-b715-46bf-95bc-cef73e65ea8d-kube-api-access-n82jt" (OuterVolumeSpecName: "kube-api-access-n82jt") pod "d918819a-b715-46bf-95bc-cef73e65ea8d" (UID: "d918819a-b715-46bf-95bc-cef73e65ea8d"). InnerVolumeSpecName "kube-api-access-n82jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.222763 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.317636 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n82jt\" (UniqueName: \"kubernetes.io/projected/d918819a-b715-46bf-95bc-cef73e65ea8d-kube-api-access-n82jt\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.317683 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-597h5\" (UniqueName: \"kubernetes.io/projected/957496c6-2ac3-42c5-981a-ea77d637bacd-kube-api-access-597h5\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.317698 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957496c6-2ac3-42c5-981a-ea77d637bacd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.317708 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d918819a-b715-46bf-95bc-cef73e65ea8d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.710631 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" 
event={"ID":"957496c6-2ac3-42c5-981a-ea77d637bacd","Type":"ContainerDied","Data":"9e60383a59482273a43a29ba6ac0c9f3b916c7205de72ec621aa2146074cfd30"} Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.710663 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-k5wvx" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.740219 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.742047 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pmmrl" event={"ID":"d918819a-b715-46bf-95bc-cef73e65ea8d","Type":"ContainerDied","Data":"1e17bd6835465fcae8ad132f95d2f7a0f7d58ccafc47366f0e48ea1a87ba0e69"} Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.890213 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pmmrl"] Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.900032 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pmmrl"] Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.951254 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-k5wvx"] Dec 01 09:26:56 crc kubenswrapper[4867]: I1201 09:26:56.977736 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-k5wvx"] Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.054867 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jmxx"] Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.301789 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n24dx"] Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.318916 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jc265"] Dec 01 
09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.332277 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pfvw2"] Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.343751 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vkxgx"] Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.755166 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9jsgc" event={"ID":"91f709b6-1fa8-40fb-80a0-45ea9510b009","Type":"ContainerStarted","Data":"502dc61d22cb397ce19088939ad961b0b75420a36d9c60923ab11eb2c510595e"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.756355 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.756391 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.757577 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pfvw2" event={"ID":"feff7c40-c771-4824-b3f0-75c4d527044a","Type":"ContainerStarted","Data":"3c321593ca4d478a3bbee5f25643d7c7f6f87e151e9a8062940600903bc805a4"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.759304 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df","Type":"ContainerStarted","Data":"f8a80b9ea482c323cb2249bfe224c151dd67d74b8e340935f98b3faa358241cd"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.760842 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n24dx" event={"ID":"14b301a3-7288-471a-8ca4-cb7f4dca4b96","Type":"ContainerStarted","Data":"fbe618455d795ee466130655645fc060bfaca3f93cc296b23ac56c9155ddfe88"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.763800 4867 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b1081877-3550-4ad4-9a89-a5cddfc4ba31","Type":"ContainerStarted","Data":"663db05c07fdb5680f0a247276b44081638fbaeed8177be306926e01ab2ed6e3"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.766447 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7a36be7a-7b6d-443d-94c6-4b3bdff15ec8","Type":"ContainerStarted","Data":"957f46314285c970974a1997f3cdc67cbabd14ab2dc0027d4c88b995710953f5"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.767871 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jc265" event={"ID":"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f","Type":"ContainerStarted","Data":"a86f28cd2cb8a5dd0c400beebbbf10839ff2ee11ba7add54389486116380c3b6"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.770311 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" event={"ID":"5b3976a5-814d-4585-b1e3-a22a6933a44d","Type":"ContainerStarted","Data":"7dd0ec9b38d64b23a3d93c542b6b7b6a54df5dc84f67c5ad7c012f4dc7d9cd3f"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.773450 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" event={"ID":"2d7ed167-4161-45c6-b1fb-6f4eb307a40f","Type":"ContainerStarted","Data":"a63bcb1cccdc8bd54d7ce964b8466b881871950973344ed5523fbafc1a7da537"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.775642 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"7c10630f55a5e3f1966308004b9596564bba3f48b49f2091a432ccd55427b09a"} Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.787320 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9jsgc" 
podStartSLOduration=34.540548637 podStartE2EDuration="40.787252262s" podCreationTimestamp="2025-12-01 09:26:17 +0000 UTC" firstStartedPulling="2025-12-01 09:26:41.472778153 +0000 UTC m=+1122.932164907" lastFinishedPulling="2025-12-01 09:26:47.719481778 +0000 UTC m=+1129.178868532" observedRunningTime="2025-12-01 09:26:57.782154012 +0000 UTC m=+1139.241540796" watchObservedRunningTime="2025-12-01 09:26:57.787252262 +0000 UTC m=+1139.246639036" Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.791794 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.849875 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:26:57 crc kubenswrapper[4867]: E1201 09:26:57.857414 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 09:26:57 crc kubenswrapper[4867]: E1201 09:26:57.857442 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 09:26:57 crc kubenswrapper[4867]: E1201 09:26:57.857486 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift podName:e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3 nodeName:}" failed. No retries permitted until 2025-12-01 09:27:01.857470421 +0000 UTC m=+1143.316857175 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift") pod "swift-storage-0" (UID: "e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3") : configmap "swift-ring-files" not found Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.860957 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371985.993832 podStartE2EDuration="50.860944236s" podCreationTimestamp="2025-12-01 09:26:07 +0000 UTC" firstStartedPulling="2025-12-01 09:26:10.090764467 +0000 UTC m=+1091.550151221" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:26:57.830594153 +0000 UTC m=+1139.289980907" watchObservedRunningTime="2025-12-01 09:26:57.860944236 +0000 UTC m=+1139.320330990" Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.874189 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.891566 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=24.264463246 podStartE2EDuration="39.891541876s" podCreationTimestamp="2025-12-01 09:26:18 +0000 UTC" firstStartedPulling="2025-12-01 09:26:40.964218727 +0000 UTC m=+1122.423605481" lastFinishedPulling="2025-12-01 09:26:56.591297357 +0000 UTC m=+1138.050684111" observedRunningTime="2025-12-01 09:26:57.860292088 +0000 UTC m=+1139.319678852" watchObservedRunningTime="2025-12-01 09:26:57.891541876 +0000 UTC m=+1139.350928640" Dec 01 09:26:57 crc kubenswrapper[4867]: I1201 09:26:57.907080 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=29.109796032 podStartE2EDuration="43.907060343s" podCreationTimestamp="2025-12-01 09:26:14 +0000 UTC" firstStartedPulling="2025-12-01 09:26:41.918878355 +0000 UTC 
m=+1123.378265109" lastFinishedPulling="2025-12-01 09:26:56.716142666 +0000 UTC m=+1138.175529420" observedRunningTime="2025-12-01 09:26:57.899252038 +0000 UTC m=+1139.358638792" watchObservedRunningTime="2025-12-01 09:26:57.907060343 +0000 UTC m=+1139.366447097" Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.786960 4867 generic.go:334] "Generic (PLEG): container finished" podID="7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" containerID="32f1a3127e5e3c543c995e38338570ead3ce15746c76a58d7eab12d67f8d3ee5" exitCode=0 Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.787485 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jc265" event={"ID":"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f","Type":"ContainerDied","Data":"32f1a3127e5e3c543c995e38338570ead3ce15746c76a58d7eab12d67f8d3ee5"} Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.789230 4867 generic.go:334] "Generic (PLEG): container finished" podID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" containerID="4d592d8d8cf173da73254559e44f6b7c6514b6dbe3a59ff8bae61234fee62ff5" exitCode=0 Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.789616 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" event={"ID":"2d7ed167-4161-45c6-b1fb-6f4eb307a40f","Type":"ContainerDied","Data":"4d592d8d8cf173da73254559e44f6b7c6514b6dbe3a59ff8bae61234fee62ff5"} Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.793854 4867 generic.go:334] "Generic (PLEG): container finished" podID="5b3976a5-814d-4585-b1e3-a22a6933a44d" containerID="0c0ab555d41ffd2d3b2057ba2d824b4f1acb0798d173ea860f3d6d9867c10960" exitCode=0 Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.793936 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" event={"ID":"5b3976a5-814d-4585-b1e3-a22a6933a44d","Type":"ContainerDied","Data":"0c0ab555d41ffd2d3b2057ba2d824b4f1acb0798d173ea860f3d6d9867c10960"} Dec 01 09:26:58 crc 
kubenswrapper[4867]: I1201 09:26:58.808159 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pfvw2" event={"ID":"feff7c40-c771-4824-b3f0-75c4d527044a","Type":"ContainerStarted","Data":"630c6b09cdab42e727583107eac6848de21c5f203a627a5c47a5ea9f2ab1ff73"} Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.810777 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.849478 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957496c6-2ac3-42c5-981a-ea77d637bacd" path="/var/lib/kubelet/pods/957496c6-2ac3-42c5-981a-ea77d637bacd/volumes" Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.849880 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d918819a-b715-46bf-95bc-cef73e65ea8d" path="/var/lib/kubelet/pods/d918819a-b715-46bf-95bc-cef73e65ea8d/volumes" Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.887480 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 01 09:26:58 crc kubenswrapper[4867]: I1201 09:26:58.923061 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-pfvw2" podStartSLOduration=8.923043366 podStartE2EDuration="8.923043366s" podCreationTimestamp="2025-12-01 09:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:26:58.920513306 +0000 UTC m=+1140.379900060" watchObservedRunningTime="2025-12-01 09:26:58.923043366 +0000 UTC m=+1140.382430130" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.157191 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:59 crc kubenswrapper[4867]: E1201 09:26:59.175315 4867 log.go:32] "CreateContainer in sandbox from runtime service 
failed" err=< Dec 01 09:26:59 crc kubenswrapper[4867]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/2d7ed167-4161-45c6-b1fb-6f4eb307a40f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 09:26:59 crc kubenswrapper[4867]: > podSandboxID="a63bcb1cccdc8bd54d7ce964b8466b881871950973344ed5523fbafc1a7da537" Dec 01 09:26:59 crc kubenswrapper[4867]: E1201 09:26:59.175499 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 01 09:26:59 crc kubenswrapper[4867]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadO
nly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nb2tk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-4jmxx_openstack(2d7ed167-4161-45c6-b1fb-6f4eb307a40f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/2d7ed167-4161-45c6-b1fb-6f4eb307a40f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 09:26:59 crc kubenswrapper[4867]: > logger="UnhandledError" Dec 01 09:26:59 crc kubenswrapper[4867]: E1201 09:26:59.177456 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/2d7ed167-4161-45c6-b1fb-6f4eb307a40f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" podUID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.241002 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.292940 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.292980 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.308943 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.399270 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-config\") pod \"5b3976a5-814d-4585-b1e3-a22a6933a44d\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.399929 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-dns-svc\") pod \"5b3976a5-814d-4585-b1e3-a22a6933a44d\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.400007 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-ovsdbserver-sb\") pod 
\"5b3976a5-814d-4585-b1e3-a22a6933a44d\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.400077 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rg5t\" (UniqueName: \"kubernetes.io/projected/5b3976a5-814d-4585-b1e3-a22a6933a44d-kube-api-access-9rg5t\") pod \"5b3976a5-814d-4585-b1e3-a22a6933a44d\" (UID: \"5b3976a5-814d-4585-b1e3-a22a6933a44d\") " Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.406952 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3976a5-814d-4585-b1e3-a22a6933a44d-kube-api-access-9rg5t" (OuterVolumeSpecName: "kube-api-access-9rg5t") pod "5b3976a5-814d-4585-b1e3-a22a6933a44d" (UID: "5b3976a5-814d-4585-b1e3-a22a6933a44d"). InnerVolumeSpecName "kube-api-access-9rg5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.425703 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-config" (OuterVolumeSpecName: "config") pod "5b3976a5-814d-4585-b1e3-a22a6933a44d" (UID: "5b3976a5-814d-4585-b1e3-a22a6933a44d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.432005 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b3976a5-814d-4585-b1e3-a22a6933a44d" (UID: "5b3976a5-814d-4585-b1e3-a22a6933a44d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.439017 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b3976a5-814d-4585-b1e3-a22a6933a44d" (UID: "5b3976a5-814d-4585-b1e3-a22a6933a44d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.502112 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rg5t\" (UniqueName: \"kubernetes.io/projected/5b3976a5-814d-4585-b1e3-a22a6933a44d-kube-api-access-9rg5t\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.502460 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.502473 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.502484 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b3976a5-814d-4585-b1e3-a22a6933a44d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.822838 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jc265" event={"ID":"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f","Type":"ContainerStarted","Data":"ecb815e96f46608a0a9c9c964ae75a0bc5ba7dcc4f294c45ec5ca6157cf1778b"} Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.822913 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.825184 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.828297 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vkxgx" event={"ID":"5b3976a5-814d-4585-b1e3-a22a6933a44d","Type":"ContainerDied","Data":"7dd0ec9b38d64b23a3d93c542b6b7b6a54df5dc84f67c5ad7c012f4dc7d9cd3f"} Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.828349 4867 scope.go:117] "RemoveContainer" containerID="0c0ab555d41ffd2d3b2057ba2d824b4f1acb0798d173ea860f3d6d9867c10960" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.829036 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.844477 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-jc265" podStartSLOduration=6.622067938 podStartE2EDuration="7.84446105s" podCreationTimestamp="2025-12-01 09:26:52 +0000 UTC" firstStartedPulling="2025-12-01 09:26:57.348298447 +0000 UTC m=+1138.807685201" lastFinishedPulling="2025-12-01 09:26:58.570691559 +0000 UTC m=+1140.030078313" observedRunningTime="2025-12-01 09:26:59.843800542 +0000 UTC m=+1141.303187306" watchObservedRunningTime="2025-12-01 09:26:59.84446105 +0000 UTC m=+1141.303847804" Dec 01 09:26:59 crc kubenswrapper[4867]: I1201 09:26:59.895551 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.007927 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vkxgx"] Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.023622 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7f896c8c65-vkxgx"] Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.103928 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 01 09:27:00 crc kubenswrapper[4867]: E1201 09:27:00.104322 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3976a5-814d-4585-b1e3-a22a6933a44d" containerName="init" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.104345 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3976a5-814d-4585-b1e3-a22a6933a44d" containerName="init" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.104548 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3976a5-814d-4585-b1e3-a22a6933a44d" containerName="init" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.106555 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.115209 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.115397 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.115505 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-665mk" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.115691 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.122968 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.218328 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/826ca141-06c3-45c3-9d5a-e99985971b80-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.218392 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/826ca141-06c3-45c3-9d5a-e99985971b80-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.218428 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/826ca141-06c3-45c3-9d5a-e99985971b80-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.218525 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpx5\" (UniqueName: \"kubernetes.io/projected/826ca141-06c3-45c3-9d5a-e99985971b80-kube-api-access-9tpx5\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.218553 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826ca141-06c3-45c3-9d5a-e99985971b80-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.218607 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826ca141-06c3-45c3-9d5a-e99985971b80-config\") pod \"ovn-northd-0\" (UID: 
\"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.218636 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/826ca141-06c3-45c3-9d5a-e99985971b80-scripts\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.319708 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/826ca141-06c3-45c3-9d5a-e99985971b80-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.319832 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpx5\" (UniqueName: \"kubernetes.io/projected/826ca141-06c3-45c3-9d5a-e99985971b80-kube-api-access-9tpx5\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.319859 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826ca141-06c3-45c3-9d5a-e99985971b80-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.319902 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826ca141-06c3-45c3-9d5a-e99985971b80-config\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.319929 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/826ca141-06c3-45c3-9d5a-e99985971b80-scripts\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.319979 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/826ca141-06c3-45c3-9d5a-e99985971b80-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.320010 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/826ca141-06c3-45c3-9d5a-e99985971b80-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.320772 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826ca141-06c3-45c3-9d5a-e99985971b80-config\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.320946 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/826ca141-06c3-45c3-9d5a-e99985971b80-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.320967 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/826ca141-06c3-45c3-9d5a-e99985971b80-scripts\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 
09:27:00.324788 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/826ca141-06c3-45c3-9d5a-e99985971b80-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.328235 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826ca141-06c3-45c3-9d5a-e99985971b80-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.336408 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/826ca141-06c3-45c3-9d5a-e99985971b80-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.337205 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpx5\" (UniqueName: \"kubernetes.io/projected/826ca141-06c3-45c3-9d5a-e99985971b80-kube-api-access-9tpx5\") pod \"ovn-northd-0\" (UID: \"826ca141-06c3-45c3-9d5a-e99985971b80\") " pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.450511 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 09:27:00 crc kubenswrapper[4867]: I1201 09:27:00.836233 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3976a5-814d-4585-b1e3-a22a6933a44d" path="/var/lib/kubelet/pods/5b3976a5-814d-4585-b1e3-a22a6933a44d/volumes" Dec 01 09:27:01 crc kubenswrapper[4867]: I1201 09:27:01.948999 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:27:01 crc kubenswrapper[4867]: E1201 09:27:01.949290 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 09:27:01 crc kubenswrapper[4867]: E1201 09:27:01.949466 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 09:27:01 crc kubenswrapper[4867]: E1201 09:27:01.949525 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift podName:e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3 nodeName:}" failed. No retries permitted until 2025-12-01 09:27:09.949505452 +0000 UTC m=+1151.408892206 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift") pod "swift-storage-0" (UID: "e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3") : configmap "swift-ring-files" not found Dec 01 09:27:02 crc kubenswrapper[4867]: I1201 09:27:02.845175 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n24dx" event={"ID":"14b301a3-7288-471a-8ca4-cb7f4dca4b96","Type":"ContainerStarted","Data":"baeea3a66130554e5f3fd8580c6678c66c0e790053ae39342a891e9caec45b26"} Dec 01 09:27:02 crc kubenswrapper[4867]: I1201 09:27:02.851239 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" event={"ID":"2d7ed167-4161-45c6-b1fb-6f4eb307a40f","Type":"ContainerStarted","Data":"d4e7c740ace2ee75197cbe24b5ef66b79ac865a5d5f7f51949790bf94baf6c0c"} Dec 01 09:27:02 crc kubenswrapper[4867]: I1201 09:27:02.852076 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:27:02 crc kubenswrapper[4867]: I1201 09:27:02.876007 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-n24dx" podStartSLOduration=3.757070394 podStartE2EDuration="8.875986067s" podCreationTimestamp="2025-12-01 09:26:54 +0000 UTC" firstStartedPulling="2025-12-01 09:26:57.315037824 +0000 UTC m=+1138.774424578" lastFinishedPulling="2025-12-01 09:27:02.433953497 +0000 UTC m=+1143.893340251" observedRunningTime="2025-12-01 09:27:02.870701851 +0000 UTC m=+1144.330088615" watchObservedRunningTime="2025-12-01 09:27:02.875986067 +0000 UTC m=+1144.335372821" Dec 01 09:27:02 crc kubenswrapper[4867]: I1201 09:27:02.888593 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 09:27:02 crc kubenswrapper[4867]: W1201 09:27:02.890145 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod826ca141_06c3_45c3_9d5a_e99985971b80.slice/crio-e6879e5ed67107bfdf47101aa7b086e115d90589015c81cd753bf48ffc83c9b4 WatchSource:0}: Error finding container e6879e5ed67107bfdf47101aa7b086e115d90589015c81cd753bf48ffc83c9b4: Status 404 returned error can't find the container with id e6879e5ed67107bfdf47101aa7b086e115d90589015c81cd753bf48ffc83c9b4 Dec 01 09:27:03 crc kubenswrapper[4867]: I1201 09:27:03.863242 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"826ca141-06c3-45c3-9d5a-e99985971b80","Type":"ContainerStarted","Data":"e6879e5ed67107bfdf47101aa7b086e115d90589015c81cd753bf48ffc83c9b4"} Dec 01 09:27:03 crc kubenswrapper[4867]: I1201 09:27:03.908433 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 01 09:27:03 crc kubenswrapper[4867]: I1201 09:27:03.933445 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" podStartSLOduration=13.057027738 podStartE2EDuration="13.933427267s" podCreationTimestamp="2025-12-01 09:26:50 +0000 UTC" firstStartedPulling="2025-12-01 09:26:57.100877872 +0000 UTC m=+1138.560264626" lastFinishedPulling="2025-12-01 09:26:57.977277401 +0000 UTC m=+1139.436664155" observedRunningTime="2025-12-01 09:27:02.903313886 +0000 UTC m=+1144.362700640" watchObservedRunningTime="2025-12-01 09:27:03.933427267 +0000 UTC m=+1145.392814021" Dec 01 09:27:03 crc kubenswrapper[4867]: I1201 09:27:03.987148 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 01 09:27:04 crc kubenswrapper[4867]: I1201 09:27:04.872108 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"826ca141-06c3-45c3-9d5a-e99985971b80","Type":"ContainerStarted","Data":"a39aa47e6920149fab06db67fe2646ce813f0c35216e0cc8abb0362603a47106"} Dec 01 09:27:04 
crc kubenswrapper[4867]: I1201 09:27:04.872418 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"826ca141-06c3-45c3-9d5a-e99985971b80","Type":"ContainerStarted","Data":"9e8809acbddb8b275d3a5fb66584d4e6a661388081946385417eae73024ae442"} Dec 01 09:27:04 crc kubenswrapper[4867]: I1201 09:27:04.893831 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.398002712 podStartE2EDuration="4.893793062s" podCreationTimestamp="2025-12-01 09:27:00 +0000 UTC" firstStartedPulling="2025-12-01 09:27:02.891217504 +0000 UTC m=+1144.350604258" lastFinishedPulling="2025-12-01 09:27:04.387007864 +0000 UTC m=+1145.846394608" observedRunningTime="2025-12-01 09:27:04.890573704 +0000 UTC m=+1146.349960458" watchObservedRunningTime="2025-12-01 09:27:04.893793062 +0000 UTC m=+1146.353179816" Dec 01 09:27:05 crc kubenswrapper[4867]: I1201 09:27:05.451274 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.353193 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-776fv"] Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.355901 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-776fv" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.385895 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-776fv"] Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.429935 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d49ca85-8824-4830-b123-56cc15703c4a-operator-scripts\") pod \"glance-db-create-776fv\" (UID: \"8d49ca85-8824-4830-b123-56cc15703c4a\") " pod="openstack/glance-db-create-776fv" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.430113 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtf8m\" (UniqueName: \"kubernetes.io/projected/8d49ca85-8824-4830-b123-56cc15703c4a-kube-api-access-mtf8m\") pod \"glance-db-create-776fv\" (UID: \"8d49ca85-8824-4830-b123-56cc15703c4a\") " pod="openstack/glance-db-create-776fv" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.482586 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-609c-account-create-update-8rb7t"] Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.483714 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.485188 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.491279 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-609c-account-create-update-8rb7t"] Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.532217 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d49ca85-8824-4830-b123-56cc15703c4a-operator-scripts\") pod \"glance-db-create-776fv\" (UID: \"8d49ca85-8824-4830-b123-56cc15703c4a\") " pod="openstack/glance-db-create-776fv" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.532535 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtf8m\" (UniqueName: \"kubernetes.io/projected/8d49ca85-8824-4830-b123-56cc15703c4a-kube-api-access-mtf8m\") pod \"glance-db-create-776fv\" (UID: \"8d49ca85-8824-4830-b123-56cc15703c4a\") " pod="openstack/glance-db-create-776fv" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.533024 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d49ca85-8824-4830-b123-56cc15703c4a-operator-scripts\") pod \"glance-db-create-776fv\" (UID: \"8d49ca85-8824-4830-b123-56cc15703c4a\") " pod="openstack/glance-db-create-776fv" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.563478 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtf8m\" (UniqueName: \"kubernetes.io/projected/8d49ca85-8824-4830-b123-56cc15703c4a-kube-api-access-mtf8m\") pod \"glance-db-create-776fv\" (UID: \"8d49ca85-8824-4830-b123-56cc15703c4a\") " pod="openstack/glance-db-create-776fv" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 
09:27:06.634553 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8622r\" (UniqueName: \"kubernetes.io/projected/64b3a692-2503-4619-8b20-090eefce0fa5-kube-api-access-8622r\") pod \"glance-609c-account-create-update-8rb7t\" (UID: \"64b3a692-2503-4619-8b20-090eefce0fa5\") " pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.634697 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b3a692-2503-4619-8b20-090eefce0fa5-operator-scripts\") pod \"glance-609c-account-create-update-8rb7t\" (UID: \"64b3a692-2503-4619-8b20-090eefce0fa5\") " pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.695679 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-776fv" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.736606 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b3a692-2503-4619-8b20-090eefce0fa5-operator-scripts\") pod \"glance-609c-account-create-update-8rb7t\" (UID: \"64b3a692-2503-4619-8b20-090eefce0fa5\") " pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.736767 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8622r\" (UniqueName: \"kubernetes.io/projected/64b3a692-2503-4619-8b20-090eefce0fa5-kube-api-access-8622r\") pod \"glance-609c-account-create-update-8rb7t\" (UID: \"64b3a692-2503-4619-8b20-090eefce0fa5\") " pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.737785 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b3a692-2503-4619-8b20-090eefce0fa5-operator-scripts\") pod \"glance-609c-account-create-update-8rb7t\" (UID: \"64b3a692-2503-4619-8b20-090eefce0fa5\") " pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.752917 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8622r\" (UniqueName: \"kubernetes.io/projected/64b3a692-2503-4619-8b20-090eefce0fa5-kube-api-access-8622r\") pod \"glance-609c-account-create-update-8rb7t\" (UID: \"64b3a692-2503-4619-8b20-090eefce0fa5\") " pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:06 crc kubenswrapper[4867]: I1201 09:27:06.808764 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:07 crc kubenswrapper[4867]: I1201 09:27:07.836372 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-609c-account-create-update-8rb7t"] Dec 01 09:27:07 crc kubenswrapper[4867]: I1201 09:27:07.876952 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-776fv"] Dec 01 09:27:07 crc kubenswrapper[4867]: W1201 09:27:07.883261 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d49ca85_8824_4830_b123_56cc15703c4a.slice/crio-255b4900667a69771ed6be71a62c3ff86a4a3437b2c5141ae24cf7b57f5137c6 WatchSource:0}: Error finding container 255b4900667a69771ed6be71a62c3ff86a4a3437b2c5141ae24cf7b57f5137c6: Status 404 returned error can't find the container with id 255b4900667a69771ed6be71a62c3ff86a4a3437b2c5141ae24cf7b57f5137c6 Dec 01 09:27:07 crc kubenswrapper[4867]: I1201 09:27:07.893463 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-609c-account-create-update-8rb7t" 
event={"ID":"64b3a692-2503-4619-8b20-090eefce0fa5","Type":"ContainerStarted","Data":"95d106d6d57c2647e03a1c9ab25c9a8a1bdc320aeda6ea082aa97b3ad9a531dd"} Dec 01 09:27:07 crc kubenswrapper[4867]: I1201 09:27:07.894542 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-776fv" event={"ID":"8d49ca85-8824-4830-b123-56cc15703c4a","Type":"ContainerStarted","Data":"255b4900667a69771ed6be71a62c3ff86a4a3437b2c5141ae24cf7b57f5137c6"} Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.316698 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.401962 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jmxx"] Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.402687 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" podUID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" containerName="dnsmasq-dns" containerID="cri-o://d4e7c740ace2ee75197cbe24b5ef66b79ac865a5d5f7f51949790bf94baf6c0c" gracePeriod=10 Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.404168 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.907360 4867 generic.go:334] "Generic (PLEG): container finished" podID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" containerID="d4e7c740ace2ee75197cbe24b5ef66b79ac865a5d5f7f51949790bf94baf6c0c" exitCode=0 Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.907422 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" event={"ID":"2d7ed167-4161-45c6-b1fb-6f4eb307a40f","Type":"ContainerDied","Data":"d4e7c740ace2ee75197cbe24b5ef66b79ac865a5d5f7f51949790bf94baf6c0c"} Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.908615 4867 generic.go:334] 
"Generic (PLEG): container finished" podID="64b3a692-2503-4619-8b20-090eefce0fa5" containerID="1c0bea86ba295a268d5b3df1c30112bbfb69bc2d940da45279664813da345de8" exitCode=0 Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.908656 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-609c-account-create-update-8rb7t" event={"ID":"64b3a692-2503-4619-8b20-090eefce0fa5","Type":"ContainerDied","Data":"1c0bea86ba295a268d5b3df1c30112bbfb69bc2d940da45279664813da345de8"} Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.910069 4867 generic.go:334] "Generic (PLEG): container finished" podID="8d49ca85-8824-4830-b123-56cc15703c4a" containerID="862230330f020c1118a30a0d5648a3c649a9f2bb119c0cfa9cda5b932d73be92" exitCode=0 Dec 01 09:27:08 crc kubenswrapper[4867]: I1201 09:27:08.910088 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-776fv" event={"ID":"8d49ca85-8824-4830-b123-56cc15703c4a","Type":"ContainerDied","Data":"862230330f020c1118a30a0d5648a3c649a9f2bb119c0cfa9cda5b932d73be92"} Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.128455 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.290023 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb2tk\" (UniqueName: \"kubernetes.io/projected/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-kube-api-access-nb2tk\") pod \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.290407 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-dns-svc\") pod \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.290464 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-sb\") pod \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.290487 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-config\") pod \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.290568 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-nb\") pod \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\" (UID: \"2d7ed167-4161-45c6-b1fb-6f4eb307a40f\") " Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.295960 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-kube-api-access-nb2tk" (OuterVolumeSpecName: "kube-api-access-nb2tk") pod "2d7ed167-4161-45c6-b1fb-6f4eb307a40f" (UID: "2d7ed167-4161-45c6-b1fb-6f4eb307a40f"). InnerVolumeSpecName "kube-api-access-nb2tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.348662 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-config" (OuterVolumeSpecName: "config") pod "2d7ed167-4161-45c6-b1fb-6f4eb307a40f" (UID: "2d7ed167-4161-45c6-b1fb-6f4eb307a40f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.350233 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d7ed167-4161-45c6-b1fb-6f4eb307a40f" (UID: "2d7ed167-4161-45c6-b1fb-6f4eb307a40f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.353125 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d7ed167-4161-45c6-b1fb-6f4eb307a40f" (UID: "2d7ed167-4161-45c6-b1fb-6f4eb307a40f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.381061 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d7ed167-4161-45c6-b1fb-6f4eb307a40f" (UID: "2d7ed167-4161-45c6-b1fb-6f4eb307a40f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.392749 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.392780 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb2tk\" (UniqueName: \"kubernetes.io/projected/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-kube-api-access-nb2tk\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.392792 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.392799 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.392806 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7ed167-4161-45c6-b1fb-6f4eb307a40f-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.919259 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.922926 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4jmxx" event={"ID":"2d7ed167-4161-45c6-b1fb-6f4eb307a40f","Type":"ContainerDied","Data":"a63bcb1cccdc8bd54d7ce964b8466b881871950973344ed5523fbafc1a7da537"} Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.924113 4867 scope.go:117] "RemoveContainer" containerID="d4e7c740ace2ee75197cbe24b5ef66b79ac865a5d5f7f51949790bf94baf6c0c" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.944126 4867 scope.go:117] "RemoveContainer" containerID="4d592d8d8cf173da73254559e44f6b7c6514b6dbe3a59ff8bae61234fee62ff5" Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.966179 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jmxx"] Dec 01 09:27:09 crc kubenswrapper[4867]: I1201 09:27:09.981589 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4jmxx"] Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.001972 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:27:10 crc kubenswrapper[4867]: E1201 09:27:10.002172 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 09:27:10 crc kubenswrapper[4867]: E1201 09:27:10.002198 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 09:27:10 crc kubenswrapper[4867]: E1201 09:27:10.002260 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift podName:e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3 nodeName:}" failed. No retries permitted until 2025-12-01 09:27:26.002241967 +0000 UTC m=+1167.461628711 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift") pod "swift-storage-0" (UID: "e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3") : configmap "swift-ring-files" not found Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.337088 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.421315 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b3a692-2503-4619-8b20-090eefce0fa5-operator-scripts\") pod \"64b3a692-2503-4619-8b20-090eefce0fa5\" (UID: \"64b3a692-2503-4619-8b20-090eefce0fa5\") " Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.421516 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8622r\" (UniqueName: \"kubernetes.io/projected/64b3a692-2503-4619-8b20-090eefce0fa5-kube-api-access-8622r\") pod \"64b3a692-2503-4619-8b20-090eefce0fa5\" (UID: \"64b3a692-2503-4619-8b20-090eefce0fa5\") " Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.422215 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64b3a692-2503-4619-8b20-090eefce0fa5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64b3a692-2503-4619-8b20-090eefce0fa5" (UID: "64b3a692-2503-4619-8b20-090eefce0fa5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.429204 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b3a692-2503-4619-8b20-090eefce0fa5-kube-api-access-8622r" (OuterVolumeSpecName: "kube-api-access-8622r") pod "64b3a692-2503-4619-8b20-090eefce0fa5" (UID: "64b3a692-2503-4619-8b20-090eefce0fa5"). InnerVolumeSpecName "kube-api-access-8622r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.518717 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-776fv" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.523285 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64b3a692-2503-4619-8b20-090eefce0fa5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.523324 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8622r\" (UniqueName: \"kubernetes.io/projected/64b3a692-2503-4619-8b20-090eefce0fa5-kube-api-access-8622r\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.595794 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vhjsc"] Dec 01 09:27:10 crc kubenswrapper[4867]: E1201 09:27:10.596139 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b3a692-2503-4619-8b20-090eefce0fa5" containerName="mariadb-account-create-update" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.596151 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b3a692-2503-4619-8b20-090eefce0fa5" containerName="mariadb-account-create-update" Dec 01 09:27:10 crc kubenswrapper[4867]: E1201 09:27:10.596165 4867 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" containerName="init" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.596170 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" containerName="init" Dec 01 09:27:10 crc kubenswrapper[4867]: E1201 09:27:10.596197 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" containerName="dnsmasq-dns" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.596202 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" containerName="dnsmasq-dns" Dec 01 09:27:10 crc kubenswrapper[4867]: E1201 09:27:10.596211 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d49ca85-8824-4830-b123-56cc15703c4a" containerName="mariadb-database-create" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.596217 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d49ca85-8824-4830-b123-56cc15703c4a" containerName="mariadb-database-create" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.596362 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" containerName="dnsmasq-dns" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.596388 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d49ca85-8824-4830-b123-56cc15703c4a" containerName="mariadb-database-create" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.596399 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b3a692-2503-4619-8b20-090eefce0fa5" containerName="mariadb-account-create-update" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.596912 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.611671 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vhjsc"] Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.624846 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtf8m\" (UniqueName: \"kubernetes.io/projected/8d49ca85-8824-4830-b123-56cc15703c4a-kube-api-access-mtf8m\") pod \"8d49ca85-8824-4830-b123-56cc15703c4a\" (UID: \"8d49ca85-8824-4830-b123-56cc15703c4a\") " Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.624968 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d49ca85-8824-4830-b123-56cc15703c4a-operator-scripts\") pod \"8d49ca85-8824-4830-b123-56cc15703c4a\" (UID: \"8d49ca85-8824-4830-b123-56cc15703c4a\") " Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.625958 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d49ca85-8824-4830-b123-56cc15703c4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d49ca85-8824-4830-b123-56cc15703c4a" (UID: "8d49ca85-8824-4830-b123-56cc15703c4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.629389 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d49ca85-8824-4830-b123-56cc15703c4a-kube-api-access-mtf8m" (OuterVolumeSpecName: "kube-api-access-mtf8m") pod "8d49ca85-8824-4830-b123-56cc15703c4a" (UID: "8d49ca85-8824-4830-b123-56cc15703c4a"). InnerVolumeSpecName "kube-api-access-mtf8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.726430 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdl45\" (UniqueName: \"kubernetes.io/projected/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-kube-api-access-cdl45\") pod \"keystone-db-create-vhjsc\" (UID: \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\") " pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.726506 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-operator-scripts\") pod \"keystone-db-create-vhjsc\" (UID: \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\") " pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.726615 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtf8m\" (UniqueName: \"kubernetes.io/projected/8d49ca85-8824-4830-b123-56cc15703c4a-kube-api-access-mtf8m\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.726628 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d49ca85-8824-4830-b123-56cc15703c4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.729691 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5e76-account-create-update-8gv64"] Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.730762 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.732565 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.741053 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5e76-account-create-update-8gv64"] Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.827383 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdl45\" (UniqueName: \"kubernetes.io/projected/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-kube-api-access-cdl45\") pod \"keystone-db-create-vhjsc\" (UID: \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\") " pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.827441 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-operator-scripts\") pod \"keystone-db-create-vhjsc\" (UID: \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\") " pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.827488 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl768\" (UniqueName: \"kubernetes.io/projected/3baefa6f-f6ea-43ce-978c-dcd5be45de35-kube-api-access-fl768\") pod \"keystone-5e76-account-create-update-8gv64\" (UID: \"3baefa6f-f6ea-43ce-978c-dcd5be45de35\") " pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.827533 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3baefa6f-f6ea-43ce-978c-dcd5be45de35-operator-scripts\") pod \"keystone-5e76-account-create-update-8gv64\" (UID: 
\"3baefa6f-f6ea-43ce-978c-dcd5be45de35\") " pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.828371 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-operator-scripts\") pod \"keystone-db-create-vhjsc\" (UID: \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\") " pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.836425 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d7ed167-4161-45c6-b1fb-6f4eb307a40f" path="/var/lib/kubelet/pods/2d7ed167-4161-45c6-b1fb-6f4eb307a40f/volumes" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.849192 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdl45\" (UniqueName: \"kubernetes.io/projected/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-kube-api-access-cdl45\") pod \"keystone-db-create-vhjsc\" (UID: \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\") " pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.912959 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.929030 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl768\" (UniqueName: \"kubernetes.io/projected/3baefa6f-f6ea-43ce-978c-dcd5be45de35-kube-api-access-fl768\") pod \"keystone-5e76-account-create-update-8gv64\" (UID: \"3baefa6f-f6ea-43ce-978c-dcd5be45de35\") " pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.929154 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3baefa6f-f6ea-43ce-978c-dcd5be45de35-operator-scripts\") pod \"keystone-5e76-account-create-update-8gv64\" (UID: \"3baefa6f-f6ea-43ce-978c-dcd5be45de35\") " pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.930735 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3baefa6f-f6ea-43ce-978c-dcd5be45de35-operator-scripts\") pod \"keystone-5e76-account-create-update-8gv64\" (UID: \"3baefa6f-f6ea-43ce-978c-dcd5be45de35\") " pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.933376 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-609c-account-create-update-8rb7t" event={"ID":"64b3a692-2503-4619-8b20-090eefce0fa5","Type":"ContainerDied","Data":"95d106d6d57c2647e03a1c9ab25c9a8a1bdc320aeda6ea082aa97b3ad9a531dd"} Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.933408 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-609c-account-create-update-8rb7t" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.933417 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95d106d6d57c2647e03a1c9ab25c9a8a1bdc320aeda6ea082aa97b3ad9a531dd" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.941632 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-776fv" event={"ID":"8d49ca85-8824-4830-b123-56cc15703c4a","Type":"ContainerDied","Data":"255b4900667a69771ed6be71a62c3ff86a4a3437b2c5141ae24cf7b57f5137c6"} Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.941662 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-776fv" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.941670 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="255b4900667a69771ed6be71a62c3ff86a4a3437b2c5141ae24cf7b57f5137c6" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.948542 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl768\" (UniqueName: \"kubernetes.io/projected/3baefa6f-f6ea-43ce-978c-dcd5be45de35-kube-api-access-fl768\") pod \"keystone-5e76-account-create-update-8gv64\" (UID: \"3baefa6f-f6ea-43ce-978c-dcd5be45de35\") " pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.989962 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-vnj5m"] Dec 01 09:27:10 crc kubenswrapper[4867]: I1201 09:27:10.991116 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.000070 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vnj5m"] Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.054284 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.132323 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c9d7-account-create-update-sqvdl"] Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.133492 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.144951 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.151624 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c9d7-account-create-update-sqvdl"] Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.154376 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10286952-3989-4e1b-ab98-2971420319da-operator-scripts\") pod \"placement-db-create-vnj5m\" (UID: \"10286952-3989-4e1b-ab98-2971420319da\") " pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.154439 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptlm9\" (UniqueName: \"kubernetes.io/projected/10286952-3989-4e1b-ab98-2971420319da-kube-api-access-ptlm9\") pod \"placement-db-create-vnj5m\" (UID: \"10286952-3989-4e1b-ab98-2971420319da\") " pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:11 crc 
kubenswrapper[4867]: I1201 09:27:11.257240 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkjf4\" (UniqueName: \"kubernetes.io/projected/8228e65b-ce56-48da-b9c5-770632f03a1c-kube-api-access-gkjf4\") pod \"placement-c9d7-account-create-update-sqvdl\" (UID: \"8228e65b-ce56-48da-b9c5-770632f03a1c\") " pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.257366 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10286952-3989-4e1b-ab98-2971420319da-operator-scripts\") pod \"placement-db-create-vnj5m\" (UID: \"10286952-3989-4e1b-ab98-2971420319da\") " pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.257429 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptlm9\" (UniqueName: \"kubernetes.io/projected/10286952-3989-4e1b-ab98-2971420319da-kube-api-access-ptlm9\") pod \"placement-db-create-vnj5m\" (UID: \"10286952-3989-4e1b-ab98-2971420319da\") " pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.257458 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8228e65b-ce56-48da-b9c5-770632f03a1c-operator-scripts\") pod \"placement-c9d7-account-create-update-sqvdl\" (UID: \"8228e65b-ce56-48da-b9c5-770632f03a1c\") " pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.258277 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10286952-3989-4e1b-ab98-2971420319da-operator-scripts\") pod \"placement-db-create-vnj5m\" (UID: \"10286952-3989-4e1b-ab98-2971420319da\") " 
pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.286724 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptlm9\" (UniqueName: \"kubernetes.io/projected/10286952-3989-4e1b-ab98-2971420319da-kube-api-access-ptlm9\") pod \"placement-db-create-vnj5m\" (UID: \"10286952-3989-4e1b-ab98-2971420319da\") " pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.359611 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8228e65b-ce56-48da-b9c5-770632f03a1c-operator-scripts\") pod \"placement-c9d7-account-create-update-sqvdl\" (UID: \"8228e65b-ce56-48da-b9c5-770632f03a1c\") " pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.360031 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkjf4\" (UniqueName: \"kubernetes.io/projected/8228e65b-ce56-48da-b9c5-770632f03a1c-kube-api-access-gkjf4\") pod \"placement-c9d7-account-create-update-sqvdl\" (UID: \"8228e65b-ce56-48da-b9c5-770632f03a1c\") " pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.360935 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8228e65b-ce56-48da-b9c5-770632f03a1c-operator-scripts\") pod \"placement-c9d7-account-create-update-sqvdl\" (UID: \"8228e65b-ce56-48da-b9c5-770632f03a1c\") " pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.383291 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkjf4\" (UniqueName: \"kubernetes.io/projected/8228e65b-ce56-48da-b9c5-770632f03a1c-kube-api-access-gkjf4\") pod 
\"placement-c9d7-account-create-update-sqvdl\" (UID: \"8228e65b-ce56-48da-b9c5-770632f03a1c\") " pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.499124 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.514233 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.599249 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vhjsc"] Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.730879 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5e76-account-create-update-8gv64"] Dec 01 09:27:11 crc kubenswrapper[4867]: W1201 09:27:11.756208 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3baefa6f_f6ea_43ce_978c_dcd5be45de35.slice/crio-7299b95dc40247f75bd279099018e2a187b8f0bd94d98d0f34b012faa1638e35 WatchSource:0}: Error finding container 7299b95dc40247f75bd279099018e2a187b8f0bd94d98d0f34b012faa1638e35: Status 404 returned error can't find the container with id 7299b95dc40247f75bd279099018e2a187b8f0bd94d98d0f34b012faa1638e35 Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.761637 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jx8tw"] Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.762910 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.765262 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qmkvk" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.765918 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.784903 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jx8tw"] Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.849536 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vnj5m"] Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.881893 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t42kj\" (UniqueName: \"kubernetes.io/projected/1964493b-eb78-487f-8210-3f6323e55583-kube-api-access-t42kj\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.881935 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-db-sync-config-data\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.881957 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-combined-ca-bundle\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.882049 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-config-data\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.935717 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c9d7-account-create-update-sqvdl"] Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.951110 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e76-account-create-update-8gv64" event={"ID":"3baefa6f-f6ea-43ce-978c-dcd5be45de35","Type":"ContainerStarted","Data":"7299b95dc40247f75bd279099018e2a187b8f0bd94d98d0f34b012faa1638e35"} Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.953759 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhjsc" event={"ID":"2ec5a60c-5b6d-49b0-b34c-f61df33220a5","Type":"ContainerStarted","Data":"32bdec604d4bbcd7757a02fe3f148ffa88f5409b101ca86636d66c77ebcaf824"} Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.953958 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhjsc" event={"ID":"2ec5a60c-5b6d-49b0-b34c-f61df33220a5","Type":"ContainerStarted","Data":"5aa2c08a0d3b5ab02a25396f40ca406bd4b17d9743d7bbfc1bda823fa2987122"} Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.957174 4867 generic.go:334] "Generic (PLEG): container finished" podID="14b301a3-7288-471a-8ca4-cb7f4dca4b96" containerID="baeea3a66130554e5f3fd8580c6678c66c0e790053ae39342a891e9caec45b26" exitCode=0 Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.957251 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n24dx" 
event={"ID":"14b301a3-7288-471a-8ca4-cb7f4dca4b96","Type":"ContainerDied","Data":"baeea3a66130554e5f3fd8580c6678c66c0e790053ae39342a891e9caec45b26"} Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.959528 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vnj5m" event={"ID":"10286952-3989-4e1b-ab98-2971420319da","Type":"ContainerStarted","Data":"d544f49991a89273de43a354d5c6c23b3a6882462b7a2624c278f3bf7b90bcb8"} Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.983936 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t42kj\" (UniqueName: \"kubernetes.io/projected/1964493b-eb78-487f-8210-3f6323e55583-kube-api-access-t42kj\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.983973 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-db-sync-config-data\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.983990 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-combined-ca-bundle\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.984065 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-config-data\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc 
kubenswrapper[4867]: I1201 09:27:11.990169 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-combined-ca-bundle\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.991379 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-db-sync-config-data\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:11 crc kubenswrapper[4867]: I1201 09:27:11.994702 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-config-data\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.014592 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t42kj\" (UniqueName: \"kubernetes.io/projected/1964493b-eb78-487f-8210-3f6323e55583-kube-api-access-t42kj\") pod \"glance-db-sync-jx8tw\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.082013 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.601246 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jx8tw"] Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.974674 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vnj5m" event={"ID":"10286952-3989-4e1b-ab98-2971420319da","Type":"ContainerStarted","Data":"ab340cc80d79466b8050c6d7e32f02329c6c3f3e4e47a95d100d8acd09266503"} Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.978316 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e76-account-create-update-8gv64" event={"ID":"3baefa6f-f6ea-43ce-978c-dcd5be45de35","Type":"ContainerStarted","Data":"53ca0f9e759b18b286bb20502ca8495c896f1c281c3f8441f2ad1262de6d2f4a"} Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.982642 4867 generic.go:334] "Generic (PLEG): container finished" podID="2ec5a60c-5b6d-49b0-b34c-f61df33220a5" containerID="32bdec604d4bbcd7757a02fe3f148ffa88f5409b101ca86636d66c77ebcaf824" exitCode=0 Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.982794 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhjsc" event={"ID":"2ec5a60c-5b6d-49b0-b34c-f61df33220a5","Type":"ContainerDied","Data":"32bdec604d4bbcd7757a02fe3f148ffa88f5409b101ca86636d66c77ebcaf824"} Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.987614 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jx8tw" event={"ID":"1964493b-eb78-487f-8210-3f6323e55583","Type":"ContainerStarted","Data":"5ed3a6831fe6cf1041a0d8c63047d225882c38c1180009c19a7c7cd99405633f"} Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.989318 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c9d7-account-create-update-sqvdl" 
event={"ID":"8228e65b-ce56-48da-b9c5-770632f03a1c","Type":"ContainerStarted","Data":"17cfad9e0f0952326b47e5b9fa4203f91932baf87f4b928c74c5cf60866d1760"} Dec 01 09:27:12 crc kubenswrapper[4867]: I1201 09:27:12.989346 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c9d7-account-create-update-sqvdl" event={"ID":"8228e65b-ce56-48da-b9c5-770632f03a1c","Type":"ContainerStarted","Data":"dde6eabc758ea1e3a3418dee3c111f40a7ef04b134ae7bd1d92598b99ec13c76"} Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.004680 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-vnj5m" podStartSLOduration=3.004661454 podStartE2EDuration="3.004661454s" podCreationTimestamp="2025-12-01 09:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:27:12.994436903 +0000 UTC m=+1154.453823667" watchObservedRunningTime="2025-12-01 09:27:13.004661454 +0000 UTC m=+1154.464048208" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.036093 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5e76-account-create-update-8gv64" podStartSLOduration=3.0360642860000002 podStartE2EDuration="3.036064286s" podCreationTimestamp="2025-12-01 09:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:27:13.03402492 +0000 UTC m=+1154.493411674" watchObservedRunningTime="2025-12-01 09:27:13.036064286 +0000 UTC m=+1154.495451040" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.067180 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c9d7-account-create-update-sqvdl" podStartSLOduration=2.067160301 podStartE2EDuration="2.067160301s" podCreationTimestamp="2025-12-01 09:27:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:27:13.060396694 +0000 UTC m=+1154.519783458" watchObservedRunningTime="2025-12-01 09:27:13.067160301 +0000 UTC m=+1154.526547055" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.510959 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.611538 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14b301a3-7288-471a-8ca4-cb7f4dca4b96-etc-swift\") pod \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.611939 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khqsr\" (UniqueName: \"kubernetes.io/projected/14b301a3-7288-471a-8ca4-cb7f4dca4b96-kube-api-access-khqsr\") pod \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.612074 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-scripts\") pod \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.612130 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-dispersionconf\") pod \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.612166 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-ring-data-devices\") pod \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.612242 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-swiftconf\") pod \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.612283 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-combined-ca-bundle\") pod \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\" (UID: \"14b301a3-7288-471a-8ca4-cb7f4dca4b96\") " Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.615309 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "14b301a3-7288-471a-8ca4-cb7f4dca4b96" (UID: "14b301a3-7288-471a-8ca4-cb7f4dca4b96"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.616166 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14b301a3-7288-471a-8ca4-cb7f4dca4b96-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "14b301a3-7288-471a-8ca4-cb7f4dca4b96" (UID: "14b301a3-7288-471a-8ca4-cb7f4dca4b96"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.617133 4867 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.617186 4867 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14b301a3-7288-471a-8ca4-cb7f4dca4b96-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.618688 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b301a3-7288-471a-8ca4-cb7f4dca4b96-kube-api-access-khqsr" (OuterVolumeSpecName: "kube-api-access-khqsr") pod "14b301a3-7288-471a-8ca4-cb7f4dca4b96" (UID: "14b301a3-7288-471a-8ca4-cb7f4dca4b96"). InnerVolumeSpecName "kube-api-access-khqsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.620949 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "14b301a3-7288-471a-8ca4-cb7f4dca4b96" (UID: "14b301a3-7288-471a-8ca4-cb7f4dca4b96"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.665408 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "14b301a3-7288-471a-8ca4-cb7f4dca4b96" (UID: "14b301a3-7288-471a-8ca4-cb7f4dca4b96"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.667269 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14b301a3-7288-471a-8ca4-cb7f4dca4b96" (UID: "14b301a3-7288-471a-8ca4-cb7f4dca4b96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.667459 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-scripts" (OuterVolumeSpecName: "scripts") pod "14b301a3-7288-471a-8ca4-cb7f4dca4b96" (UID: "14b301a3-7288-471a-8ca4-cb7f4dca4b96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.718762 4867 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.718805 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.718847 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khqsr\" (UniqueName: \"kubernetes.io/projected/14b301a3-7288-471a-8ca4-cb7f4dca4b96-kube-api-access-khqsr\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.718862 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14b301a3-7288-471a-8ca4-cb7f4dca4b96-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 
09:27:13 crc kubenswrapper[4867]: I1201 09:27:13.718873 4867 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14b301a3-7288-471a-8ca4-cb7f4dca4b96-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.012187 4867 generic.go:334] "Generic (PLEG): container finished" podID="10286952-3989-4e1b-ab98-2971420319da" containerID="ab340cc80d79466b8050c6d7e32f02329c6c3f3e4e47a95d100d8acd09266503" exitCode=0 Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.012252 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vnj5m" event={"ID":"10286952-3989-4e1b-ab98-2971420319da","Type":"ContainerDied","Data":"ab340cc80d79466b8050c6d7e32f02329c6c3f3e4e47a95d100d8acd09266503"} Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.014745 4867 generic.go:334] "Generic (PLEG): container finished" podID="3baefa6f-f6ea-43ce-978c-dcd5be45de35" containerID="53ca0f9e759b18b286bb20502ca8495c896f1c281c3f8441f2ad1262de6d2f4a" exitCode=0 Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.014962 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e76-account-create-update-8gv64" event={"ID":"3baefa6f-f6ea-43ce-978c-dcd5be45de35","Type":"ContainerDied","Data":"53ca0f9e759b18b286bb20502ca8495c896f1c281c3f8441f2ad1262de6d2f4a"} Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.022141 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n24dx" event={"ID":"14b301a3-7288-471a-8ca4-cb7f4dca4b96","Type":"ContainerDied","Data":"fbe618455d795ee466130655645fc060bfaca3f93cc296b23ac56c9155ddfe88"} Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.022335 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbe618455d795ee466130655645fc060bfaca3f93cc296b23ac56c9155ddfe88" Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.024984 4867 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n24dx" Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.043915 4867 generic.go:334] "Generic (PLEG): container finished" podID="8228e65b-ce56-48da-b9c5-770632f03a1c" containerID="17cfad9e0f0952326b47e5b9fa4203f91932baf87f4b928c74c5cf60866d1760" exitCode=0 Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.044115 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c9d7-account-create-update-sqvdl" event={"ID":"8228e65b-ce56-48da-b9c5-770632f03a1c","Type":"ContainerDied","Data":"17cfad9e0f0952326b47e5b9fa4203f91932baf87f4b928c74c5cf60866d1760"} Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.380090 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.530524 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdl45\" (UniqueName: \"kubernetes.io/projected/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-kube-api-access-cdl45\") pod \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\" (UID: \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\") " Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.530629 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-operator-scripts\") pod \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\" (UID: \"2ec5a60c-5b6d-49b0-b34c-f61df33220a5\") " Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.532105 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ec5a60c-5b6d-49b0-b34c-f61df33220a5" (UID: "2ec5a60c-5b6d-49b0-b34c-f61df33220a5"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.543832 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-kube-api-access-cdl45" (OuterVolumeSpecName: "kube-api-access-cdl45") pod "2ec5a60c-5b6d-49b0-b34c-f61df33220a5" (UID: "2ec5a60c-5b6d-49b0-b34c-f61df33220a5"). InnerVolumeSpecName "kube-api-access-cdl45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.633328 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:14 crc kubenswrapper[4867]: I1201 09:27:14.633360 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdl45\" (UniqueName: \"kubernetes.io/projected/2ec5a60c-5b6d-49b0-b34c-f61df33220a5-kube-api-access-cdl45\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.055072 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhjsc" event={"ID":"2ec5a60c-5b6d-49b0-b34c-f61df33220a5","Type":"ContainerDied","Data":"5aa2c08a0d3b5ab02a25396f40ca406bd4b17d9743d7bbfc1bda823fa2987122"} Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.055177 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vhjsc" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.055458 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa2c08a0d3b5ab02a25396f40ca406bd4b17d9743d7bbfc1bda823fa2987122" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.449371 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.521522 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.549887 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3baefa6f-f6ea-43ce-978c-dcd5be45de35-operator-scripts\") pod \"3baefa6f-f6ea-43ce-978c-dcd5be45de35\" (UID: \"3baefa6f-f6ea-43ce-978c-dcd5be45de35\") " Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.549930 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl768\" (UniqueName: \"kubernetes.io/projected/3baefa6f-f6ea-43ce-978c-dcd5be45de35-kube-api-access-fl768\") pod \"3baefa6f-f6ea-43ce-978c-dcd5be45de35\" (UID: \"3baefa6f-f6ea-43ce-978c-dcd5be45de35\") " Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.551567 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3baefa6f-f6ea-43ce-978c-dcd5be45de35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3baefa6f-f6ea-43ce-978c-dcd5be45de35" (UID: "3baefa6f-f6ea-43ce-978c-dcd5be45de35"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.560615 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3baefa6f-f6ea-43ce-978c-dcd5be45de35-kube-api-access-fl768" (OuterVolumeSpecName: "kube-api-access-fl768") pod "3baefa6f-f6ea-43ce-978c-dcd5be45de35" (UID: "3baefa6f-f6ea-43ce-978c-dcd5be45de35"). InnerVolumeSpecName "kube-api-access-fl768". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.611479 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.628875 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.651529 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl768\" (UniqueName: \"kubernetes.io/projected/3baefa6f-f6ea-43ce-978c-dcd5be45de35-kube-api-access-fl768\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.651564 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3baefa6f-f6ea-43ce-978c-dcd5be45de35-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.754166 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptlm9\" (UniqueName: \"kubernetes.io/projected/10286952-3989-4e1b-ab98-2971420319da-kube-api-access-ptlm9\") pod \"10286952-3989-4e1b-ab98-2971420319da\" (UID: \"10286952-3989-4e1b-ab98-2971420319da\") " Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.755113 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10286952-3989-4e1b-ab98-2971420319da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10286952-3989-4e1b-ab98-2971420319da" (UID: "10286952-3989-4e1b-ab98-2971420319da"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.755294 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10286952-3989-4e1b-ab98-2971420319da-operator-scripts\") pod \"10286952-3989-4e1b-ab98-2971420319da\" (UID: \"10286952-3989-4e1b-ab98-2971420319da\") " Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.755423 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8228e65b-ce56-48da-b9c5-770632f03a1c-operator-scripts\") pod \"8228e65b-ce56-48da-b9c5-770632f03a1c\" (UID: \"8228e65b-ce56-48da-b9c5-770632f03a1c\") " Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.755478 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkjf4\" (UniqueName: \"kubernetes.io/projected/8228e65b-ce56-48da-b9c5-770632f03a1c-kube-api-access-gkjf4\") pod \"8228e65b-ce56-48da-b9c5-770632f03a1c\" (UID: \"8228e65b-ce56-48da-b9c5-770632f03a1c\") " Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.755875 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8228e65b-ce56-48da-b9c5-770632f03a1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8228e65b-ce56-48da-b9c5-770632f03a1c" (UID: "8228e65b-ce56-48da-b9c5-770632f03a1c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.755916 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10286952-3989-4e1b-ab98-2971420319da-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.758067 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10286952-3989-4e1b-ab98-2971420319da-kube-api-access-ptlm9" (OuterVolumeSpecName: "kube-api-access-ptlm9") pod "10286952-3989-4e1b-ab98-2971420319da" (UID: "10286952-3989-4e1b-ab98-2971420319da"). InnerVolumeSpecName "kube-api-access-ptlm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.758489 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8228e65b-ce56-48da-b9c5-770632f03a1c-kube-api-access-gkjf4" (OuterVolumeSpecName: "kube-api-access-gkjf4") pod "8228e65b-ce56-48da-b9c5-770632f03a1c" (UID: "8228e65b-ce56-48da-b9c5-770632f03a1c"). InnerVolumeSpecName "kube-api-access-gkjf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.857948 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptlm9\" (UniqueName: \"kubernetes.io/projected/10286952-3989-4e1b-ab98-2971420319da-kube-api-access-ptlm9\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.857982 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8228e65b-ce56-48da-b9c5-770632f03a1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:15 crc kubenswrapper[4867]: I1201 09:27:15.857991 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkjf4\" (UniqueName: \"kubernetes.io/projected/8228e65b-ce56-48da-b9c5-770632f03a1c-kube-api-access-gkjf4\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:16 crc kubenswrapper[4867]: I1201 09:27:16.082456 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c9d7-account-create-update-sqvdl" event={"ID":"8228e65b-ce56-48da-b9c5-770632f03a1c","Type":"ContainerDied","Data":"dde6eabc758ea1e3a3418dee3c111f40a7ef04b134ae7bd1d92598b99ec13c76"} Dec 01 09:27:16 crc kubenswrapper[4867]: I1201 09:27:16.082506 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dde6eabc758ea1e3a3418dee3c111f40a7ef04b134ae7bd1d92598b99ec13c76" Dec 01 09:27:16 crc kubenswrapper[4867]: I1201 09:27:16.082603 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c9d7-account-create-update-sqvdl" Dec 01 09:27:16 crc kubenswrapper[4867]: I1201 09:27:16.085967 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vnj5m" event={"ID":"10286952-3989-4e1b-ab98-2971420319da","Type":"ContainerDied","Data":"d544f49991a89273de43a354d5c6c23b3a6882462b7a2624c278f3bf7b90bcb8"} Dec 01 09:27:16 crc kubenswrapper[4867]: I1201 09:27:16.085998 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d544f49991a89273de43a354d5c6c23b3a6882462b7a2624c278f3bf7b90bcb8" Dec 01 09:27:16 crc kubenswrapper[4867]: I1201 09:27:16.086066 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vnj5m" Dec 01 09:27:16 crc kubenswrapper[4867]: I1201 09:27:16.092172 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e76-account-create-update-8gv64" event={"ID":"3baefa6f-f6ea-43ce-978c-dcd5be45de35","Type":"ContainerDied","Data":"7299b95dc40247f75bd279099018e2a187b8f0bd94d98d0f34b012faa1638e35"} Dec 01 09:27:16 crc kubenswrapper[4867]: I1201 09:27:16.094601 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7299b95dc40247f75bd279099018e2a187b8f0bd94d98d0f34b012faa1638e35" Dec 01 09:27:16 crc kubenswrapper[4867]: I1201 09:27:16.094912 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5e76-account-create-update-8gv64" Dec 01 09:27:18 crc kubenswrapper[4867]: I1201 09:27:18.202911 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vmh2x" podUID="aa810b5f-4cad-40cc-9feb-6afc38b56ab1" containerName="ovn-controller" probeResult="failure" output=< Dec 01 09:27:18 crc kubenswrapper[4867]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 01 09:27:18 crc kubenswrapper[4867]: > Dec 01 09:27:21 crc kubenswrapper[4867]: I1201 09:27:21.133267 4867 generic.go:334] "Generic (PLEG): container finished" podID="63bff526-5063-4326-8b3c-0c580320be58" containerID="bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f" exitCode=0 Dec 01 09:27:21 crc kubenswrapper[4867]: I1201 09:27:21.133466 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63bff526-5063-4326-8b3c-0c580320be58","Type":"ContainerDied","Data":"bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f"} Dec 01 09:27:21 crc kubenswrapper[4867]: I1201 09:27:21.136909 4867 generic.go:334] "Generic (PLEG): container finished" podID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" containerID="b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc" exitCode=0 Dec 01 09:27:21 crc kubenswrapper[4867]: I1201 09:27:21.136939 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7f260d89-a8a0-4d49-a34a-a36a06ef2eee","Type":"ContainerDied","Data":"b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc"} Dec 01 09:27:23 crc kubenswrapper[4867]: I1201 09:27:23.205720 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vmh2x" podUID="aa810b5f-4cad-40cc-9feb-6afc38b56ab1" containerName="ovn-controller" probeResult="failure" output=< Dec 01 09:27:23 crc kubenswrapper[4867]: ERROR - ovn-controller connection status is 'not 
connected', expecting 'connected' status Dec 01 09:27:23 crc kubenswrapper[4867]: > Dec 01 09:27:23 crc kubenswrapper[4867]: I1201 09:27:23.249479 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:27:26 crc kubenswrapper[4867]: I1201 09:27:26.024857 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:27:26 crc kubenswrapper[4867]: I1201 09:27:26.030151 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3-etc-swift\") pod \"swift-storage-0\" (UID: \"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3\") " pod="openstack/swift-storage-0" Dec 01 09:27:26 crc kubenswrapper[4867]: I1201 09:27:26.181863 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 01 09:27:26 crc kubenswrapper[4867]: I1201 09:27:26.777784 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 09:27:27 crc kubenswrapper[4867]: I1201 09:27:27.200703 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jx8tw" event={"ID":"1964493b-eb78-487f-8210-3f6323e55583","Type":"ContainerStarted","Data":"f0bc2379a1b2bab52c22ad998eeca13270df75d0a8c16184cb6d038832e7e045"} Dec 01 09:27:27 crc kubenswrapper[4867]: I1201 09:27:27.204160 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63bff526-5063-4326-8b3c-0c580320be58","Type":"ContainerStarted","Data":"16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5"} Dec 01 09:27:27 crc kubenswrapper[4867]: I1201 09:27:27.204774 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 09:27:27 crc kubenswrapper[4867]: I1201 09:27:27.206736 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"d80abcc1999fa3505d2ea8ce7669a59ee57ee24fc2e2a1ef2f02e90fb85c052e"} Dec 01 09:27:27 crc kubenswrapper[4867]: I1201 09:27:27.208519 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7f260d89-a8a0-4d49-a34a-a36a06ef2eee","Type":"ContainerStarted","Data":"5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008"} Dec 01 09:27:27 crc kubenswrapper[4867]: I1201 09:27:27.209178 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:27:27 crc kubenswrapper[4867]: I1201 09:27:27.222415 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jx8tw" podStartSLOduration=2.872397764 
podStartE2EDuration="16.22239977s" podCreationTimestamp="2025-12-01 09:27:11 +0000 UTC" firstStartedPulling="2025-12-01 09:27:12.617869431 +0000 UTC m=+1154.077256185" lastFinishedPulling="2025-12-01 09:27:25.967871427 +0000 UTC m=+1167.427258191" observedRunningTime="2025-12-01 09:27:27.21584693 +0000 UTC m=+1168.675233684" watchObservedRunningTime="2025-12-01 09:27:27.22239977 +0000 UTC m=+1168.681786524" Dec 01 09:27:27 crc kubenswrapper[4867]: I1201 09:27:27.261105 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.34147797 podStartE2EDuration="1m21.261085322s" podCreationTimestamp="2025-12-01 09:26:06 +0000 UTC" firstStartedPulling="2025-12-01 09:26:08.799131596 +0000 UTC m=+1090.258518350" lastFinishedPulling="2025-12-01 09:26:47.718738948 +0000 UTC m=+1129.178125702" observedRunningTime="2025-12-01 09:27:27.245427323 +0000 UTC m=+1168.704814097" watchObservedRunningTime="2025-12-01 09:27:27.261085322 +0000 UTC m=+1168.720472086" Dec 01 09:27:27 crc kubenswrapper[4867]: I1201 09:27:27.288133 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.973491145 podStartE2EDuration="1m21.288112655s" podCreationTimestamp="2025-12-01 09:26:06 +0000 UTC" firstStartedPulling="2025-12-01 09:26:08.404769926 +0000 UTC m=+1089.864156680" lastFinishedPulling="2025-12-01 09:26:47.719391436 +0000 UTC m=+1129.178778190" observedRunningTime="2025-12-01 09:27:27.269173144 +0000 UTC m=+1168.728559908" watchObservedRunningTime="2025-12-01 09:27:27.288112655 +0000 UTC m=+1168.747499409" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.207643 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vmh2x" podUID="aa810b5f-4cad-40cc-9feb-6afc38b56ab1" containerName="ovn-controller" probeResult="failure" output=< Dec 01 09:27:28 crc kubenswrapper[4867]: ERROR - ovn-controller connection 
status is 'not connected', expecting 'connected' status Dec 01 09:27:28 crc kubenswrapper[4867]: > Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.278390 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9jsgc" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.490776 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vmh2x-config-x86wm"] Dec 01 09:27:28 crc kubenswrapper[4867]: E1201 09:27:28.491178 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3baefa6f-f6ea-43ce-978c-dcd5be45de35" containerName="mariadb-account-create-update" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491202 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3baefa6f-f6ea-43ce-978c-dcd5be45de35" containerName="mariadb-account-create-update" Dec 01 09:27:28 crc kubenswrapper[4867]: E1201 09:27:28.491219 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b301a3-7288-471a-8ca4-cb7f4dca4b96" containerName="swift-ring-rebalance" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491227 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b301a3-7288-471a-8ca4-cb7f4dca4b96" containerName="swift-ring-rebalance" Dec 01 09:27:28 crc kubenswrapper[4867]: E1201 09:27:28.491243 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ec5a60c-5b6d-49b0-b34c-f61df33220a5" containerName="mariadb-database-create" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491252 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec5a60c-5b6d-49b0-b34c-f61df33220a5" containerName="mariadb-database-create" Dec 01 09:27:28 crc kubenswrapper[4867]: E1201 09:27:28.491286 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10286952-3989-4e1b-ab98-2971420319da" containerName="mariadb-database-create" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491294 4867 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="10286952-3989-4e1b-ab98-2971420319da" containerName="mariadb-database-create" Dec 01 09:27:28 crc kubenswrapper[4867]: E1201 09:27:28.491309 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8228e65b-ce56-48da-b9c5-770632f03a1c" containerName="mariadb-account-create-update" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491318 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8228e65b-ce56-48da-b9c5-770632f03a1c" containerName="mariadb-account-create-update" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491533 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="10286952-3989-4e1b-ab98-2971420319da" containerName="mariadb-database-create" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491554 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ec5a60c-5b6d-49b0-b34c-f61df33220a5" containerName="mariadb-database-create" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491566 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b301a3-7288-471a-8ca4-cb7f4dca4b96" containerName="swift-ring-rebalance" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491582 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8228e65b-ce56-48da-b9c5-770632f03a1c" containerName="mariadb-account-create-update" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.491594 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3baefa6f-f6ea-43ce-978c-dcd5be45de35" containerName="mariadb-account-create-update" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.492273 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.504831 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.510105 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vmh2x-config-x86wm"] Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.569450 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run-ovn\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.569510 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-additional-scripts\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.569571 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-scripts\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.569594 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: 
\"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.569628 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-log-ovn\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.569664 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8vk\" (UniqueName: \"kubernetes.io/projected/15b83cb3-8307-4cb1-9f5c-a0280b19445e-kube-api-access-cb8vk\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.672264 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8vk\" (UniqueName: \"kubernetes.io/projected/15b83cb3-8307-4cb1-9f5c-a0280b19445e-kube-api-access-cb8vk\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.672535 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run-ovn\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.672669 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-additional-scripts\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.672791 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-scripts\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.672887 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.673005 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-log-ovn\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.673201 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-log-ovn\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.673298 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.672897 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run-ovn\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.673508 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-additional-scripts\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.675766 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-scripts\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.694524 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8vk\" (UniqueName: \"kubernetes.io/projected/15b83cb3-8307-4cb1-9f5c-a0280b19445e-kube-api-access-cb8vk\") pod \"ovn-controller-vmh2x-config-x86wm\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:28 crc kubenswrapper[4867]: I1201 09:27:28.815096 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:29 crc kubenswrapper[4867]: I1201 09:27:29.229217 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"981076384e674cbfcfb347f39966a0f1fdf2e490e112b6e424db0142d0ffdcec"} Dec 01 09:27:29 crc kubenswrapper[4867]: I1201 09:27:29.229849 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"135450ab3762a0d8019a6e513cc3937cb82dfa90acc3b630723ee98a4cc67b5f"} Dec 01 09:27:29 crc kubenswrapper[4867]: I1201 09:27:29.318565 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vmh2x-config-x86wm"] Dec 01 09:27:29 crc kubenswrapper[4867]: W1201 09:27:29.354364 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15b83cb3_8307_4cb1_9f5c_a0280b19445e.slice/crio-6c343bf9c589f23396796697ebe2973a326cf6fe6c1db581d0238956874fd0f0 WatchSource:0}: Error finding container 6c343bf9c589f23396796697ebe2973a326cf6fe6c1db581d0238956874fd0f0: Status 404 returned error can't find the container with id 6c343bf9c589f23396796697ebe2973a326cf6fe6c1db581d0238956874fd0f0 Dec 01 09:27:30 crc kubenswrapper[4867]: I1201 09:27:30.240027 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"f8505f899c677cd8a32c5787e526781f36dc5678242eaf621330fc1726520028"} Dec 01 09:27:30 crc kubenswrapper[4867]: I1201 09:27:30.240375 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"7f4fd79b7661ab3d9a9c7d4b1f0a6b84e30f04331bbfc931904aea10179982f4"} Dec 01 
09:27:30 crc kubenswrapper[4867]: I1201 09:27:30.242201 4867 generic.go:334] "Generic (PLEG): container finished" podID="15b83cb3-8307-4cb1-9f5c-a0280b19445e" containerID="7e00294f2f5021f38a5460df958b260deab9be5063fbc31d2cf6db4058f6c3b8" exitCode=0 Dec 01 09:27:30 crc kubenswrapper[4867]: I1201 09:27:30.242240 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vmh2x-config-x86wm" event={"ID":"15b83cb3-8307-4cb1-9f5c-a0280b19445e","Type":"ContainerDied","Data":"7e00294f2f5021f38a5460df958b260deab9be5063fbc31d2cf6db4058f6c3b8"} Dec 01 09:27:30 crc kubenswrapper[4867]: I1201 09:27:30.242263 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vmh2x-config-x86wm" event={"ID":"15b83cb3-8307-4cb1-9f5c-a0280b19445e","Type":"ContainerStarted","Data":"6c343bf9c589f23396796697ebe2973a326cf6fe6c1db581d0238956874fd0f0"} Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.257643 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"6fc55d8ecb951158931c05e6a827dfed61848fdd8f0f6363d1bdf32342e850f5"} Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.618827 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.730619 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run\") pod \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.730771 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-log-ovn\") pod \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.730953 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run-ovn\") pod \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.731057 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-additional-scripts\") pod \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.731328 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-scripts\") pod \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.731425 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb8vk\" (UniqueName: 
\"kubernetes.io/projected/15b83cb3-8307-4cb1-9f5c-a0280b19445e-kube-api-access-cb8vk\") pod \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\" (UID: \"15b83cb3-8307-4cb1-9f5c-a0280b19445e\") " Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.732949 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "15b83cb3-8307-4cb1-9f5c-a0280b19445e" (UID: "15b83cb3-8307-4cb1-9f5c-a0280b19445e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.734376 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "15b83cb3-8307-4cb1-9f5c-a0280b19445e" (UID: "15b83cb3-8307-4cb1-9f5c-a0280b19445e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.734410 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run" (OuterVolumeSpecName: "var-run") pod "15b83cb3-8307-4cb1-9f5c-a0280b19445e" (UID: "15b83cb3-8307-4cb1-9f5c-a0280b19445e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.734429 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "15b83cb3-8307-4cb1-9f5c-a0280b19445e" (UID: "15b83cb3-8307-4cb1-9f5c-a0280b19445e"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.735010 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-scripts" (OuterVolumeSpecName: "scripts") pod "15b83cb3-8307-4cb1-9f5c-a0280b19445e" (UID: "15b83cb3-8307-4cb1-9f5c-a0280b19445e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.740981 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b83cb3-8307-4cb1-9f5c-a0280b19445e-kube-api-access-cb8vk" (OuterVolumeSpecName: "kube-api-access-cb8vk") pod "15b83cb3-8307-4cb1-9f5c-a0280b19445e" (UID: "15b83cb3-8307-4cb1-9f5c-a0280b19445e"). InnerVolumeSpecName "kube-api-access-cb8vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.833945 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.834305 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb8vk\" (UniqueName: \"kubernetes.io/projected/15b83cb3-8307-4cb1-9f5c-a0280b19445e-kube-api-access-cb8vk\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.834379 4867 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.834444 4867 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 
09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.834504 4867 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/15b83cb3-8307-4cb1-9f5c-a0280b19445e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:31 crc kubenswrapper[4867]: I1201 09:27:31.834603 4867 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/15b83cb3-8307-4cb1-9f5c-a0280b19445e-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:32 crc kubenswrapper[4867]: I1201 09:27:32.270687 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"80df666becf8c031b68f96ecc1b66a58880fb17adba12289fe075dcac52d7f13"} Dec 01 09:27:32 crc kubenswrapper[4867]: I1201 09:27:32.270751 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"c67e529f7f3a685f026136852216b9f2ecc51379814ae2f085e186fe22f4bb61"} Dec 01 09:27:32 crc kubenswrapper[4867]: I1201 09:27:32.270765 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"80db071b347a80afd64f624f5ef00e931e316d7018f69ab15df385418f45cc90"} Dec 01 09:27:32 crc kubenswrapper[4867]: I1201 09:27:32.272420 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vmh2x-config-x86wm" event={"ID":"15b83cb3-8307-4cb1-9f5c-a0280b19445e","Type":"ContainerDied","Data":"6c343bf9c589f23396796697ebe2973a326cf6fe6c1db581d0238956874fd0f0"} Dec 01 09:27:32 crc kubenswrapper[4867]: I1201 09:27:32.272534 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c343bf9c589f23396796697ebe2973a326cf6fe6c1db581d0238956874fd0f0" Dec 01 09:27:32 crc 
kubenswrapper[4867]: I1201 09:27:32.272491 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vmh2x-config-x86wm" Dec 01 09:27:32 crc kubenswrapper[4867]: I1201 09:27:32.728901 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vmh2x-config-x86wm"] Dec 01 09:27:32 crc kubenswrapper[4867]: I1201 09:27:32.737295 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vmh2x-config-x86wm"] Dec 01 09:27:32 crc kubenswrapper[4867]: I1201 09:27:32.838947 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15b83cb3-8307-4cb1-9f5c-a0280b19445e" path="/var/lib/kubelet/pods/15b83cb3-8307-4cb1-9f5c-a0280b19445e/volumes" Dec 01 09:27:33 crc kubenswrapper[4867]: I1201 09:27:33.221555 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vmh2x" Dec 01 09:27:35 crc kubenswrapper[4867]: I1201 09:27:35.302635 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"8405bdbf2baf57c694195c3c3d1ef46591e70e9ed70b34a539ae05d5e3cc0d5f"} Dec 01 09:27:36 crc kubenswrapper[4867]: I1201 09:27:36.317380 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"c634b2c23fe791c51a8852ddcd5101d947f50a5c00e8859984d341173f791d61"} Dec 01 09:27:36 crc kubenswrapper[4867]: I1201 09:27:36.317728 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"3d3608528f9fb3e236ad9d0a79ab083fd0919e44448dddbc1035906ceaf40a5e"} Dec 01 09:27:36 crc kubenswrapper[4867]: I1201 09:27:36.317743 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"d64cbfbfc92ea7c431d28a6d1460b71f6caf8f0ed815a965300c4aec5c20f276"} Dec 01 09:27:37 crc kubenswrapper[4867]: I1201 09:27:37.332125 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"71f561304ad72d670ae364407d0201c3ee6af5e93993b7ea1ad401ffec4c615d"} Dec 01 09:27:37 crc kubenswrapper[4867]: I1201 09:27:37.332176 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"94292da1ee5a8d38290566428857659f1ea7fe33edf80c6dd18f181befaeacb9"} Dec 01 09:27:37 crc kubenswrapper[4867]: I1201 09:27:37.722093 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="63bff526-5063-4326-8b3c-0c580320be58" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.134045 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.349126 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3","Type":"ContainerStarted","Data":"7e7f56a7f8173b2ee8632aaf4de4318bdcfbc9864d4a8150b5629181a5c8566b"} Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.353033 4867 generic.go:334] "Generic (PLEG): container finished" podID="1964493b-eb78-487f-8210-3f6323e55583" containerID="f0bc2379a1b2bab52c22ad998eeca13270df75d0a8c16184cb6d038832e7e045" exitCode=0 Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.353085 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jx8tw" 
event={"ID":"1964493b-eb78-487f-8210-3f6323e55583","Type":"ContainerDied","Data":"f0bc2379a1b2bab52c22ad998eeca13270df75d0a8c16184cb6d038832e7e045"} Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.387091 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.097029797 podStartE2EDuration="45.38707398s" podCreationTimestamp="2025-12-01 09:26:53 +0000 UTC" firstStartedPulling="2025-12-01 09:27:26.782504739 +0000 UTC m=+1168.241891493" lastFinishedPulling="2025-12-01 09:27:35.072548922 +0000 UTC m=+1176.531935676" observedRunningTime="2025-12-01 09:27:38.380605222 +0000 UTC m=+1179.839991986" watchObservedRunningTime="2025-12-01 09:27:38.38707398 +0000 UTC m=+1179.846460734" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.670173 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lm952"] Dec 01 09:27:38 crc kubenswrapper[4867]: E1201 09:27:38.670678 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b83cb3-8307-4cb1-9f5c-a0280b19445e" containerName="ovn-config" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.670698 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b83cb3-8307-4cb1-9f5c-a0280b19445e" containerName="ovn-config" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.670962 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b83cb3-8307-4cb1-9f5c-a0280b19445e" containerName="ovn-config" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.672008 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.681298 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.701590 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lm952"] Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.748138 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.748406 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9d9c\" (UniqueName: \"kubernetes.io/projected/f5ed80ec-e51c-4c09-9187-42e795fedbe8-kube-api-access-z9d9c\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.748551 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-config\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.748686 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " 
pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.748782 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.748884 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.850818 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.850874 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9d9c\" (UniqueName: \"kubernetes.io/projected/f5ed80ec-e51c-4c09-9187-42e795fedbe8-kube-api-access-z9d9c\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.850901 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-config\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " 
pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.850934 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.850967 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.850988 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.851718 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.852059 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc 
kubenswrapper[4867]: I1201 09:27:38.852111 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.852187 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.852379 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-config\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.882568 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9d9c\" (UniqueName: \"kubernetes.io/projected/f5ed80ec-e51c-4c09-9187-42e795fedbe8-kube-api-access-z9d9c\") pod \"dnsmasq-dns-764c5664d7-lm952\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:38 crc kubenswrapper[4867]: I1201 09:27:38.992590 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.573266 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lm952"] Dec 01 09:27:39 crc kubenswrapper[4867]: W1201 09:27:39.581026 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5ed80ec_e51c_4c09_9187_42e795fedbe8.slice/crio-7d81aff916a7d02a42dc1343300305a3e77275f109b770b5223d6c305d7a9796 WatchSource:0}: Error finding container 7d81aff916a7d02a42dc1343300305a3e77275f109b770b5223d6c305d7a9796: Status 404 returned error can't find the container with id 7d81aff916a7d02a42dc1343300305a3e77275f109b770b5223d6c305d7a9796 Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.783624 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.869879 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-config-data\") pod \"1964493b-eb78-487f-8210-3f6323e55583\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.869942 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-combined-ca-bundle\") pod \"1964493b-eb78-487f-8210-3f6323e55583\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.870008 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-db-sync-config-data\") pod \"1964493b-eb78-487f-8210-3f6323e55583\" (UID: 
\"1964493b-eb78-487f-8210-3f6323e55583\") " Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.870080 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t42kj\" (UniqueName: \"kubernetes.io/projected/1964493b-eb78-487f-8210-3f6323e55583-kube-api-access-t42kj\") pod \"1964493b-eb78-487f-8210-3f6323e55583\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.873540 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1964493b-eb78-487f-8210-3f6323e55583" (UID: "1964493b-eb78-487f-8210-3f6323e55583"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.873582 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1964493b-eb78-487f-8210-3f6323e55583-kube-api-access-t42kj" (OuterVolumeSpecName: "kube-api-access-t42kj") pod "1964493b-eb78-487f-8210-3f6323e55583" (UID: "1964493b-eb78-487f-8210-3f6323e55583"). InnerVolumeSpecName "kube-api-access-t42kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:39 crc kubenswrapper[4867]: E1201 09:27:39.947317 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-config-data podName:1964493b-eb78-487f-8210-3f6323e55583 nodeName:}" failed. No retries permitted until 2025-12-01 09:27:40.447289618 +0000 UTC m=+1181.906676372 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-config-data") pod "1964493b-eb78-487f-8210-3f6323e55583" (UID: "1964493b-eb78-487f-8210-3f6323e55583") : error deleting /var/lib/kubelet/pods/1964493b-eb78-487f-8210-3f6323e55583/volume-subpaths: remove /var/lib/kubelet/pods/1964493b-eb78-487f-8210-3f6323e55583/volume-subpaths: no such file or directory Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.950727 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1964493b-eb78-487f-8210-3f6323e55583" (UID: "1964493b-eb78-487f-8210-3f6323e55583"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.972568 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.972789 4867 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:39 crc kubenswrapper[4867]: I1201 09:27:39.972869 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t42kj\" (UniqueName: \"kubernetes.io/projected/1964493b-eb78-487f-8210-3f6323e55583-kube-api-access-t42kj\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.372685 4867 generic.go:334] "Generic (PLEG): container finished" podID="f5ed80ec-e51c-4c09-9187-42e795fedbe8" containerID="c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26" exitCode=0 Dec 01 09:27:40 crc 
kubenswrapper[4867]: I1201 09:27:40.372790 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lm952" event={"ID":"f5ed80ec-e51c-4c09-9187-42e795fedbe8","Type":"ContainerDied","Data":"c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26"} Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.372854 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lm952" event={"ID":"f5ed80ec-e51c-4c09-9187-42e795fedbe8","Type":"ContainerStarted","Data":"7d81aff916a7d02a42dc1343300305a3e77275f109b770b5223d6c305d7a9796"} Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.374511 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jx8tw" event={"ID":"1964493b-eb78-487f-8210-3f6323e55583","Type":"ContainerDied","Data":"5ed3a6831fe6cf1041a0d8c63047d225882c38c1180009c19a7c7cd99405633f"} Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.374547 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ed3a6831fe6cf1041a0d8c63047d225882c38c1180009c19a7c7cd99405633f" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.374566 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jx8tw" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.485980 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-config-data\") pod \"1964493b-eb78-487f-8210-3f6323e55583\" (UID: \"1964493b-eb78-487f-8210-3f6323e55583\") " Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.499066 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-config-data" (OuterVolumeSpecName: "config-data") pod "1964493b-eb78-487f-8210-3f6323e55583" (UID: "1964493b-eb78-487f-8210-3f6323e55583"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.626788 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1964493b-eb78-487f-8210-3f6323e55583-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.738328 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lm952"] Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.777792 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-k9s48"] Dec 01 09:27:40 crc kubenswrapper[4867]: E1201 09:27:40.778149 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1964493b-eb78-487f-8210-3f6323e55583" containerName="glance-db-sync" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.778162 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1964493b-eb78-487f-8210-3f6323e55583" containerName="glance-db-sync" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.778347 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1964493b-eb78-487f-8210-3f6323e55583" containerName="glance-db-sync" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.779264 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.802923 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-k9s48"] Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.934261 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.934316 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.934384 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxtvm\" (UniqueName: \"kubernetes.io/projected/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-kube-api-access-nxtvm\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.934420 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.934470 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-config\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:40 crc kubenswrapper[4867]: I1201 09:27:40.934492 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.035507 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.035566 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-config\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.035592 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.035661 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.035687 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.035728 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxtvm\" (UniqueName: \"kubernetes.io/projected/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-kube-api-access-nxtvm\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.036770 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.036920 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-config\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.037064 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.037597 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.037837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.058922 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxtvm\" (UniqueName: \"kubernetes.io/projected/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-kube-api-access-nxtvm\") pod \"dnsmasq-dns-74f6bcbc87-k9s48\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.107106 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.393041 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lm952" event={"ID":"f5ed80ec-e51c-4c09-9187-42e795fedbe8","Type":"ContainerStarted","Data":"7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99"} Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.393794 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.417455 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-lm952" podStartSLOduration=3.417431894 podStartE2EDuration="3.417431894s" podCreationTimestamp="2025-12-01 09:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:27:41.411868761 +0000 UTC m=+1182.871255515" watchObservedRunningTime="2025-12-01 09:27:41.417431894 +0000 UTC m=+1182.876818658" Dec 01 09:27:41 crc kubenswrapper[4867]: I1201 09:27:41.559956 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-k9s48"] Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.402891 4867 generic.go:334] "Generic (PLEG): container finished" podID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerID="465819b356223a4bd0035c262d41cfb3ae8635231e1dde776d2e5aabb5239c02" exitCode=0 Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.402969 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" event={"ID":"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a","Type":"ContainerDied","Data":"465819b356223a4bd0035c262d41cfb3ae8635231e1dde776d2e5aabb5239c02"} Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.403225 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" event={"ID":"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a","Type":"ContainerStarted","Data":"82dcdb9e11cc79c809628959b5150232686fd41007be8714f8d83c56e8a5f1f7"} Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.403330 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-lm952" podUID="f5ed80ec-e51c-4c09-9187-42e795fedbe8" containerName="dnsmasq-dns" containerID="cri-o://7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99" gracePeriod=10 Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.739265 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.866393 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-swift-storage-0\") pod \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.866440 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-svc\") pod \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.866460 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-nb\") pod \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.866522 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-config\") pod \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.866550 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9d9c\" (UniqueName: \"kubernetes.io/projected/f5ed80ec-e51c-4c09-9187-42e795fedbe8-kube-api-access-z9d9c\") pod \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.866634 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-sb\") pod \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\" (UID: \"f5ed80ec-e51c-4c09-9187-42e795fedbe8\") " Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.871619 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ed80ec-e51c-4c09-9187-42e795fedbe8-kube-api-access-z9d9c" (OuterVolumeSpecName: "kube-api-access-z9d9c") pod "f5ed80ec-e51c-4c09-9187-42e795fedbe8" (UID: "f5ed80ec-e51c-4c09-9187-42e795fedbe8"). InnerVolumeSpecName "kube-api-access-z9d9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.911208 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5ed80ec-e51c-4c09-9187-42e795fedbe8" (UID: "f5ed80ec-e51c-4c09-9187-42e795fedbe8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.918637 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5ed80ec-e51c-4c09-9187-42e795fedbe8" (UID: "f5ed80ec-e51c-4c09-9187-42e795fedbe8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.921018 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-config" (OuterVolumeSpecName: "config") pod "f5ed80ec-e51c-4c09-9187-42e795fedbe8" (UID: "f5ed80ec-e51c-4c09-9187-42e795fedbe8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.923541 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5ed80ec-e51c-4c09-9187-42e795fedbe8" (UID: "f5ed80ec-e51c-4c09-9187-42e795fedbe8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.941193 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5ed80ec-e51c-4c09-9187-42e795fedbe8" (UID: "f5ed80ec-e51c-4c09-9187-42e795fedbe8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.968966 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.969270 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.969282 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.969294 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.969302 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ed80ec-e51c-4c09-9187-42e795fedbe8-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:42 crc kubenswrapper[4867]: I1201 09:27:42.969311 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9d9c\" (UniqueName: \"kubernetes.io/projected/f5ed80ec-e51c-4c09-9187-42e795fedbe8-kube-api-access-z9d9c\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.414511 4867 generic.go:334] "Generic (PLEG): container finished" podID="f5ed80ec-e51c-4c09-9187-42e795fedbe8" containerID="7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99" exitCode=0 Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.414586 4867 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-lm952" Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.414624 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lm952" event={"ID":"f5ed80ec-e51c-4c09-9187-42e795fedbe8","Type":"ContainerDied","Data":"7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99"} Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.414677 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lm952" event={"ID":"f5ed80ec-e51c-4c09-9187-42e795fedbe8","Type":"ContainerDied","Data":"7d81aff916a7d02a42dc1343300305a3e77275f109b770b5223d6c305d7a9796"} Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.414701 4867 scope.go:117] "RemoveContainer" containerID="7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99" Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.417854 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" event={"ID":"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a","Type":"ContainerStarted","Data":"9adfed2bde346b21fa4adfc36b9773c70edefc16f3d71c0ef87af1899b53d9d2"} Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.418047 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.442963 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" podStartSLOduration=3.442941721 podStartE2EDuration="3.442941721s" podCreationTimestamp="2025-12-01 09:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:27:43.436510353 +0000 UTC m=+1184.895897117" watchObservedRunningTime="2025-12-01 09:27:43.442941721 +0000 UTC m=+1184.902328475" Dec 01 09:27:43 crc 
kubenswrapper[4867]: I1201 09:27:43.451988 4867 scope.go:117] "RemoveContainer" containerID="c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26" Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.464560 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lm952"] Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.473804 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lm952"] Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.484863 4867 scope.go:117] "RemoveContainer" containerID="7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99" Dec 01 09:27:43 crc kubenswrapper[4867]: E1201 09:27:43.485311 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99\": container with ID starting with 7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99 not found: ID does not exist" containerID="7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99" Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.485352 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99"} err="failed to get container status \"7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99\": rpc error: code = NotFound desc = could not find container \"7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99\": container with ID starting with 7a8b6be31e8b37b115b98256ecf96ab68073f9b5490da58acd74b90c75291c99 not found: ID does not exist" Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.485380 4867 scope.go:117] "RemoveContainer" containerID="c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26" Dec 01 09:27:43 crc kubenswrapper[4867]: E1201 09:27:43.485863 4867 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26\": container with ID starting with c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26 not found: ID does not exist" containerID="c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26" Dec 01 09:27:43 crc kubenswrapper[4867]: I1201 09:27:43.485895 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26"} err="failed to get container status \"c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26\": rpc error: code = NotFound desc = could not find container \"c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26\": container with ID starting with c1584e4c53fa08d901f9232a467d9f1b2e0c506e4294cc1571682ccb62db7a26 not found: ID does not exist" Dec 01 09:27:44 crc kubenswrapper[4867]: I1201 09:27:44.835638 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ed80ec-e51c-4c09-9187-42e795fedbe8" path="/var/lib/kubelet/pods/f5ed80ec-e51c-4c09-9187-42e795fedbe8/volumes" Dec 01 09:27:47 crc kubenswrapper[4867]: I1201 09:27:47.722119 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.134270 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h82jj"] Dec 01 09:27:48 crc kubenswrapper[4867]: E1201 09:27:48.134676 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ed80ec-e51c-4c09-9187-42e795fedbe8" containerName="init" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.134703 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ed80ec-e51c-4c09-9187-42e795fedbe8" containerName="init" Dec 01 09:27:48 crc kubenswrapper[4867]: E1201 09:27:48.134735 4867 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ed80ec-e51c-4c09-9187-42e795fedbe8" containerName="dnsmasq-dns" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.134744 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ed80ec-e51c-4c09-9187-42e795fedbe8" containerName="dnsmasq-dns" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.134997 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ed80ec-e51c-4c09-9187-42e795fedbe8" containerName="dnsmasq-dns" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.135711 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.147547 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d209-account-create-update-b55jg"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.148508 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.155162 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.161666 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h82jj"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.173530 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d209-account-create-update-b55jg"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.240297 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-99g2f"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.241257 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.257849 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda04dfd-8163-458a-baa4-df9622a4f5c6-operator-scripts\") pod \"cinder-db-create-h82jj\" (UID: \"fda04dfd-8163-458a-baa4-df9622a4f5c6\") " pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.257895 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfdb5\" (UniqueName: \"kubernetes.io/projected/8fe79080-dd3d-45d8-9929-030bb4eb72c3-kube-api-access-sfdb5\") pod \"cinder-d209-account-create-update-b55jg\" (UID: \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\") " pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.257964 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxx2\" (UniqueName: \"kubernetes.io/projected/fda04dfd-8163-458a-baa4-df9622a4f5c6-kube-api-access-tpxx2\") pod \"cinder-db-create-h82jj\" (UID: \"fda04dfd-8163-458a-baa4-df9622a4f5c6\") " pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.258005 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fe79080-dd3d-45d8-9929-030bb4eb72c3-operator-scripts\") pod \"cinder-d209-account-create-update-b55jg\" (UID: \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\") " pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.266158 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-99g2f"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.360263 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjfvg\" (UniqueName: \"kubernetes.io/projected/fedf8a1e-7645-4e0c-800a-e551181e5781-kube-api-access-bjfvg\") pod \"barbican-db-create-99g2f\" (UID: \"fedf8a1e-7645-4e0c-800a-e551181e5781\") " pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.360339 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda04dfd-8163-458a-baa4-df9622a4f5c6-operator-scripts\") pod \"cinder-db-create-h82jj\" (UID: \"fda04dfd-8163-458a-baa4-df9622a4f5c6\") " pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.360368 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfdb5\" (UniqueName: \"kubernetes.io/projected/8fe79080-dd3d-45d8-9929-030bb4eb72c3-kube-api-access-sfdb5\") pod \"cinder-d209-account-create-update-b55jg\" (UID: \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\") " pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.360419 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fedf8a1e-7645-4e0c-800a-e551181e5781-operator-scripts\") pod \"barbican-db-create-99g2f\" (UID: \"fedf8a1e-7645-4e0c-800a-e551181e5781\") " pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.360453 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxx2\" (UniqueName: \"kubernetes.io/projected/fda04dfd-8163-458a-baa4-df9622a4f5c6-kube-api-access-tpxx2\") pod \"cinder-db-create-h82jj\" (UID: \"fda04dfd-8163-458a-baa4-df9622a4f5c6\") " pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 
09:27:48.360489 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fe79080-dd3d-45d8-9929-030bb4eb72c3-operator-scripts\") pod \"cinder-d209-account-create-update-b55jg\" (UID: \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\") " pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.361537 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda04dfd-8163-458a-baa4-df9622a4f5c6-operator-scripts\") pod \"cinder-db-create-h82jj\" (UID: \"fda04dfd-8163-458a-baa4-df9622a4f5c6\") " pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.361553 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fe79080-dd3d-45d8-9929-030bb4eb72c3-operator-scripts\") pod \"cinder-d209-account-create-update-b55jg\" (UID: \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\") " pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.371643 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-92d3-account-create-update-4ttg6"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.372742 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.375374 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.396397 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfdb5\" (UniqueName: \"kubernetes.io/projected/8fe79080-dd3d-45d8-9929-030bb4eb72c3-kube-api-access-sfdb5\") pod \"cinder-d209-account-create-update-b55jg\" (UID: \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\") " pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.400410 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxx2\" (UniqueName: \"kubernetes.io/projected/fda04dfd-8163-458a-baa4-df9622a4f5c6-kube-api-access-tpxx2\") pod \"cinder-db-create-h82jj\" (UID: \"fda04dfd-8163-458a-baa4-df9622a4f5c6\") " pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.401285 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-92d3-account-create-update-4ttg6"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.456624 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.462306 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fedf8a1e-7645-4e0c-800a-e551181e5781-operator-scripts\") pod \"barbican-db-create-99g2f\" (UID: \"fedf8a1e-7645-4e0c-800a-e551181e5781\") " pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.462379 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd0b6f8-d458-42e6-a07a-ba22d371037d-operator-scripts\") pod \"barbican-92d3-account-create-update-4ttg6\" (UID: \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\") " pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.462432 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjfvg\" (UniqueName: \"kubernetes.io/projected/fedf8a1e-7645-4e0c-800a-e551181e5781-kube-api-access-bjfvg\") pod \"barbican-db-create-99g2f\" (UID: \"fedf8a1e-7645-4e0c-800a-e551181e5781\") " pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.462490 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhbht\" (UniqueName: \"kubernetes.io/projected/bbd0b6f8-d458-42e6-a07a-ba22d371037d-kube-api-access-xhbht\") pod \"barbican-92d3-account-create-update-4ttg6\" (UID: \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\") " pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.463103 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fedf8a1e-7645-4e0c-800a-e551181e5781-operator-scripts\") pod 
\"barbican-db-create-99g2f\" (UID: \"fedf8a1e-7645-4e0c-800a-e551181e5781\") " pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.465642 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.488580 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjfvg\" (UniqueName: \"kubernetes.io/projected/fedf8a1e-7645-4e0c-800a-e551181e5781-kube-api-access-bjfvg\") pod \"barbican-db-create-99g2f\" (UID: \"fedf8a1e-7645-4e0c-800a-e551181e5781\") " pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.541479 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-s59tq"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.542624 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.566728 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd0b6f8-d458-42e6-a07a-ba22d371037d-operator-scripts\") pod \"barbican-92d3-account-create-update-4ttg6\" (UID: \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\") " pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.567462 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd0b6f8-d458-42e6-a07a-ba22d371037d-operator-scripts\") pod \"barbican-92d3-account-create-update-4ttg6\" (UID: \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\") " pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.567698 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-xhbht\" (UniqueName: \"kubernetes.io/projected/bbd0b6f8-d458-42e6-a07a-ba22d371037d-kube-api-access-xhbht\") pod \"barbican-92d3-account-create-update-4ttg6\" (UID: \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\") " pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.583257 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-s59tq"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.589701 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.619376 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhbht\" (UniqueName: \"kubernetes.io/projected/bbd0b6f8-d458-42e6-a07a-ba22d371037d-kube-api-access-xhbht\") pod \"barbican-92d3-account-create-update-4ttg6\" (UID: \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\") " pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.636204 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2vbhv"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.637505 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.651313 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h9gmt" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.651523 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.651632 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.668142 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.669337 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-operator-scripts\") pod \"neutron-db-create-s59tq\" (UID: \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\") " pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.669415 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcq8j\" (UniqueName: \"kubernetes.io/projected/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-kube-api-access-wcq8j\") pod \"neutron-db-create-s59tq\" (UID: \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\") " pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.691218 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2vbhv"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.692803 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.726699 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bf6f-account-create-update-87lj2"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.727863 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.733354 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.769502 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bf6f-account-create-update-87lj2"] Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.770359 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcq8j\" (UniqueName: \"kubernetes.io/projected/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-kube-api-access-wcq8j\") pod \"neutron-db-create-s59tq\" (UID: \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\") " pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.770411 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-combined-ca-bundle\") pod \"keystone-db-sync-2vbhv\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.770498 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-config-data\") pod \"keystone-db-sync-2vbhv\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc 
kubenswrapper[4867]: I1201 09:27:48.770581 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk6sr\" (UniqueName: \"kubernetes.io/projected/39cdb2ad-9d97-4f37-90e4-a41f554c8755-kube-api-access-hk6sr\") pod \"keystone-db-sync-2vbhv\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.771094 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-operator-scripts\") pod \"neutron-db-create-s59tq\" (UID: \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\") " pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.771770 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-operator-scripts\") pod \"neutron-db-create-s59tq\" (UID: \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\") " pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.811107 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcq8j\" (UniqueName: \"kubernetes.io/projected/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-kube-api-access-wcq8j\") pod \"neutron-db-create-s59tq\" (UID: \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\") " pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.873448 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90966ae5-9855-4e64-bebc-fc216f56de50-operator-scripts\") pod \"neutron-bf6f-account-create-update-87lj2\" (UID: \"90966ae5-9855-4e64-bebc-fc216f56de50\") " pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:48 crc 
kubenswrapper[4867]: I1201 09:27:48.873546 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-combined-ca-bundle\") pod \"keystone-db-sync-2vbhv\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.873876 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-config-data\") pod \"keystone-db-sync-2vbhv\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.874041 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr6jf\" (UniqueName: \"kubernetes.io/projected/90966ae5-9855-4e64-bebc-fc216f56de50-kube-api-access-nr6jf\") pod \"neutron-bf6f-account-create-update-87lj2\" (UID: \"90966ae5-9855-4e64-bebc-fc216f56de50\") " pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.874167 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk6sr\" (UniqueName: \"kubernetes.io/projected/39cdb2ad-9d97-4f37-90e4-a41f554c8755-kube-api-access-hk6sr\") pod \"keystone-db-sync-2vbhv\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.881442 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-combined-ca-bundle\") pod \"keystone-db-sync-2vbhv\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 
09:27:48.895642 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.896414 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-config-data\") pod \"keystone-db-sync-2vbhv\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.919393 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk6sr\" (UniqueName: \"kubernetes.io/projected/39cdb2ad-9d97-4f37-90e4-a41f554c8755-kube-api-access-hk6sr\") pod \"keystone-db-sync-2vbhv\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.975672 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr6jf\" (UniqueName: \"kubernetes.io/projected/90966ae5-9855-4e64-bebc-fc216f56de50-kube-api-access-nr6jf\") pod \"neutron-bf6f-account-create-update-87lj2\" (UID: \"90966ae5-9855-4e64-bebc-fc216f56de50\") " pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.976592 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90966ae5-9855-4e64-bebc-fc216f56de50-operator-scripts\") pod \"neutron-bf6f-account-create-update-87lj2\" (UID: \"90966ae5-9855-4e64-bebc-fc216f56de50\") " pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:48 crc kubenswrapper[4867]: I1201 09:27:48.978306 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90966ae5-9855-4e64-bebc-fc216f56de50-operator-scripts\") pod 
\"neutron-bf6f-account-create-update-87lj2\" (UID: \"90966ae5-9855-4e64-bebc-fc216f56de50\") " pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.000207 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr6jf\" (UniqueName: \"kubernetes.io/projected/90966ae5-9855-4e64-bebc-fc216f56de50-kube-api-access-nr6jf\") pod \"neutron-bf6f-account-create-update-87lj2\" (UID: \"90966ae5-9855-4e64-bebc-fc216f56de50\") " pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.001767 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.071314 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.164350 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d209-account-create-update-b55jg"] Dec 01 09:27:49 crc kubenswrapper[4867]: W1201 09:27:49.177215 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fe79080_dd3d_45d8_9929_030bb4eb72c3.slice/crio-07e1cf74a3b8d5c3303465e6a2e257b568c27152a2cf7a8774bdb80a71364d83 WatchSource:0}: Error finding container 07e1cf74a3b8d5c3303465e6a2e257b568c27152a2cf7a8774bdb80a71364d83: Status 404 returned error can't find the container with id 07e1cf74a3b8d5c3303465e6a2e257b568c27152a2cf7a8774bdb80a71364d83 Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.305098 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h82jj"] Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.492261 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h82jj" 
event={"ID":"fda04dfd-8163-458a-baa4-df9622a4f5c6","Type":"ContainerStarted","Data":"e1929629f060c3ab057f70905858d2d61eb3374a02779b5b70da260baca56994"} Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.494690 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d209-account-create-update-b55jg" event={"ID":"8fe79080-dd3d-45d8-9929-030bb4eb72c3","Type":"ContainerStarted","Data":"07e1cf74a3b8d5c3303465e6a2e257b568c27152a2cf7a8774bdb80a71364d83"} Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.563092 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-99g2f"] Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.626679 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-92d3-account-create-update-4ttg6"] Dec 01 09:27:49 crc kubenswrapper[4867]: W1201 09:27:49.631236 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbd0b6f8_d458_42e6_a07a_ba22d371037d.slice/crio-a5f7341f8c768b69c0905ef6c25eac4d60d1ee9952c0d96e9dbef31b9d66be87 WatchSource:0}: Error finding container a5f7341f8c768b69c0905ef6c25eac4d60d1ee9952c0d96e9dbef31b9d66be87: Status 404 returned error can't find the container with id a5f7341f8c768b69c0905ef6c25eac4d60d1ee9952c0d96e9dbef31b9d66be87 Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.642238 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2vbhv"] Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.679227 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-s59tq"] Dec 01 09:27:49 crc kubenswrapper[4867]: I1201 09:27:49.690569 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bf6f-account-create-update-87lj2"] Dec 01 09:27:49 crc kubenswrapper[4867]: W1201 09:27:49.729206 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90966ae5_9855_4e64_bebc_fc216f56de50.slice/crio-150ff65d28c3fa30702904e41802aa5f757752092106ead00173e238027b7e54 WatchSource:0}: Error finding container 150ff65d28c3fa30702904e41802aa5f757752092106ead00173e238027b7e54: Status 404 returned error can't find the container with id 150ff65d28c3fa30702904e41802aa5f757752092106ead00173e238027b7e54 Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.509265 4867 generic.go:334] "Generic (PLEG): container finished" podID="8fe79080-dd3d-45d8-9929-030bb4eb72c3" containerID="14a3f72edb9ee9270095c0c2c342697f4e91a64667ae6a96d0cdf7f28921ec3a" exitCode=0 Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.509332 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d209-account-create-update-b55jg" event={"ID":"8fe79080-dd3d-45d8-9929-030bb4eb72c3","Type":"ContainerDied","Data":"14a3f72edb9ee9270095c0c2c342697f4e91a64667ae6a96d0cdf7f28921ec3a"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.513144 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2vbhv" event={"ID":"39cdb2ad-9d97-4f37-90e4-a41f554c8755","Type":"ContainerStarted","Data":"babb86ff73430870ace32874ac73285df05d22ef9c04ccdb9dd78037999f41ba"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.515113 4867 generic.go:334] "Generic (PLEG): container finished" podID="90966ae5-9855-4e64-bebc-fc216f56de50" containerID="29a54cec9fb8cfd42529134bc3244a9cd16da33451aba1da17eabdb5d5387c87" exitCode=0 Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.515223 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bf6f-account-create-update-87lj2" event={"ID":"90966ae5-9855-4e64-bebc-fc216f56de50","Type":"ContainerDied","Data":"29a54cec9fb8cfd42529134bc3244a9cd16da33451aba1da17eabdb5d5387c87"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.515248 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-bf6f-account-create-update-87lj2" event={"ID":"90966ae5-9855-4e64-bebc-fc216f56de50","Type":"ContainerStarted","Data":"150ff65d28c3fa30702904e41802aa5f757752092106ead00173e238027b7e54"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.516503 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s59tq" event={"ID":"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9","Type":"ContainerStarted","Data":"b90db1a3d03ded2af337174bb31955a2d8ef7554ab85f2979bb089eecae73e82"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.516545 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s59tq" event={"ID":"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9","Type":"ContainerStarted","Data":"455dce731a725c22705537dbe89b9bafbeb6ef108645384e46846d7c71dd4c58"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.517696 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-92d3-account-create-update-4ttg6" event={"ID":"bbd0b6f8-d458-42e6-a07a-ba22d371037d","Type":"ContainerStarted","Data":"3fae2292faff5ca2946cc92ca9bc42d45b109d75c6dd47e4adaedb1369efaf7b"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.517863 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-92d3-account-create-update-4ttg6" event={"ID":"bbd0b6f8-d458-42e6-a07a-ba22d371037d","Type":"ContainerStarted","Data":"a5f7341f8c768b69c0905ef6c25eac4d60d1ee9952c0d96e9dbef31b9d66be87"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.520644 4867 generic.go:334] "Generic (PLEG): container finished" podID="fda04dfd-8163-458a-baa4-df9622a4f5c6" containerID="8541da6d714fb4be5f095bda500a278e17d2bb50ae3359126b8837171fb7851b" exitCode=0 Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.520691 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h82jj" 
event={"ID":"fda04dfd-8163-458a-baa4-df9622a4f5c6","Type":"ContainerDied","Data":"8541da6d714fb4be5f095bda500a278e17d2bb50ae3359126b8837171fb7851b"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.522339 4867 generic.go:334] "Generic (PLEG): container finished" podID="fedf8a1e-7645-4e0c-800a-e551181e5781" containerID="6bc8c6f1329022f1c5a544e78e2955000f239bcac76f4222378fe9d25b355391" exitCode=0 Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.522377 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-99g2f" event={"ID":"fedf8a1e-7645-4e0c-800a-e551181e5781","Type":"ContainerDied","Data":"6bc8c6f1329022f1c5a544e78e2955000f239bcac76f4222378fe9d25b355391"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.522395 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-99g2f" event={"ID":"fedf8a1e-7645-4e0c-800a-e551181e5781","Type":"ContainerStarted","Data":"1411f8fbf0aef246d9a1577516643932d9c5a0ce3e4d4842eb460e31c1bed992"} Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.545550 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-92d3-account-create-update-4ttg6" podStartSLOduration=2.545530272 podStartE2EDuration="2.545530272s" podCreationTimestamp="2025-12-01 09:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:27:50.542678773 +0000 UTC m=+1192.002065537" watchObservedRunningTime="2025-12-01 09:27:50.545530272 +0000 UTC m=+1192.004917016" Dec 01 09:27:50 crc kubenswrapper[4867]: I1201 09:27:50.559710 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-s59tq" podStartSLOduration=2.559692341 podStartE2EDuration="2.559692341s" podCreationTimestamp="2025-12-01 09:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-01 09:27:50.555137705 +0000 UTC m=+1192.014524449" watchObservedRunningTime="2025-12-01 09:27:50.559692341 +0000 UTC m=+1192.019079095" Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.115049 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.179754 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jc265"] Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.180051 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-jc265" podUID="7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" containerName="dnsmasq-dns" containerID="cri-o://ecb815e96f46608a0a9c9c964ae75a0bc5ba7dcc4f294c45ec5ca6157cf1778b" gracePeriod=10 Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.536467 4867 generic.go:334] "Generic (PLEG): container finished" podID="2a97c7f8-1f59-4d4b-b689-7e5de839d1b9" containerID="b90db1a3d03ded2af337174bb31955a2d8ef7554ab85f2979bb089eecae73e82" exitCode=0 Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.536529 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s59tq" event={"ID":"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9","Type":"ContainerDied","Data":"b90db1a3d03ded2af337174bb31955a2d8ef7554ab85f2979bb089eecae73e82"} Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.546019 4867 generic.go:334] "Generic (PLEG): container finished" podID="bbd0b6f8-d458-42e6-a07a-ba22d371037d" containerID="3fae2292faff5ca2946cc92ca9bc42d45b109d75c6dd47e4adaedb1369efaf7b" exitCode=0 Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.546098 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-92d3-account-create-update-4ttg6" 
event={"ID":"bbd0b6f8-d458-42e6-a07a-ba22d371037d","Type":"ContainerDied","Data":"3fae2292faff5ca2946cc92ca9bc42d45b109d75c6dd47e4adaedb1369efaf7b"} Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.553299 4867 generic.go:334] "Generic (PLEG): container finished" podID="7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" containerID="ecb815e96f46608a0a9c9c964ae75a0bc5ba7dcc4f294c45ec5ca6157cf1778b" exitCode=0 Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.554018 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jc265" event={"ID":"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f","Type":"ContainerDied","Data":"ecb815e96f46608a0a9c9c964ae75a0bc5ba7dcc4f294c45ec5ca6157cf1778b"} Dec 01 09:27:51 crc kubenswrapper[4867]: I1201 09:27:51.818197 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:51.967446 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.009147 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvpzn\" (UniqueName: \"kubernetes.io/projected/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-kube-api-access-nvpzn\") pod \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.009529 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-nb\") pod \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.009616 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-sb\") pod \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.009658 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-config\") pod \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.009709 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-dns-svc\") pod \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\" (UID: \"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f\") " Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.022564 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-kube-api-access-nvpzn" (OuterVolumeSpecName: "kube-api-access-nvpzn") pod "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" (UID: "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f"). InnerVolumeSpecName "kube-api-access-nvpzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.062630 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-config" (OuterVolumeSpecName: "config") pod "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" (UID: "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.066605 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.072247 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" (UID: "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.078459 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" (UID: "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.093159 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" (UID: "7e2bcd3c-d57d-422c-921a-b4fadc65cb6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.111208 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fedf8a1e-7645-4e0c-800a-e551181e5781-operator-scripts\") pod \"fedf8a1e-7645-4e0c-800a-e551181e5781\" (UID: \"fedf8a1e-7645-4e0c-800a-e551181e5781\") " Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.111440 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjfvg\" (UniqueName: \"kubernetes.io/projected/fedf8a1e-7645-4e0c-800a-e551181e5781-kube-api-access-bjfvg\") pod \"fedf8a1e-7645-4e0c-800a-e551181e5781\" (UID: \"fedf8a1e-7645-4e0c-800a-e551181e5781\") " Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.111474 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr6jf\" (UniqueName: \"kubernetes.io/projected/90966ae5-9855-4e64-bebc-fc216f56de50-kube-api-access-nr6jf\") pod \"90966ae5-9855-4e64-bebc-fc216f56de50\" (UID: \"90966ae5-9855-4e64-bebc-fc216f56de50\") " Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.111693 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvpzn\" (UniqueName: \"kubernetes.io/projected/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-kube-api-access-nvpzn\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.111707 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.111716 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.111725 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.111733 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.111936 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fedf8a1e-7645-4e0c-800a-e551181e5781-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fedf8a1e-7645-4e0c-800a-e551181e5781" (UID: "fedf8a1e-7645-4e0c-800a-e551181e5781"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.115802 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fedf8a1e-7645-4e0c-800a-e551181e5781-kube-api-access-bjfvg" (OuterVolumeSpecName: "kube-api-access-bjfvg") pod "fedf8a1e-7645-4e0c-800a-e551181e5781" (UID: "fedf8a1e-7645-4e0c-800a-e551181e5781"). InnerVolumeSpecName "kube-api-access-bjfvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.115904 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90966ae5-9855-4e64-bebc-fc216f56de50-kube-api-access-nr6jf" (OuterVolumeSpecName: "kube-api-access-nr6jf") pod "90966ae5-9855-4e64-bebc-fc216f56de50" (UID: "90966ae5-9855-4e64-bebc-fc216f56de50"). InnerVolumeSpecName "kube-api-access-nr6jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.212914 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90966ae5-9855-4e64-bebc-fc216f56de50-operator-scripts\") pod \"90966ae5-9855-4e64-bebc-fc216f56de50\" (UID: \"90966ae5-9855-4e64-bebc-fc216f56de50\") " Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.213375 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjfvg\" (UniqueName: \"kubernetes.io/projected/fedf8a1e-7645-4e0c-800a-e551181e5781-kube-api-access-bjfvg\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.213393 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr6jf\" (UniqueName: \"kubernetes.io/projected/90966ae5-9855-4e64-bebc-fc216f56de50-kube-api-access-nr6jf\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.213406 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fedf8a1e-7645-4e0c-800a-e551181e5781-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.213443 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90966ae5-9855-4e64-bebc-fc216f56de50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"90966ae5-9855-4e64-bebc-fc216f56de50" (UID: "90966ae5-9855-4e64-bebc-fc216f56de50"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.314575 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90966ae5-9855-4e64-bebc-fc216f56de50-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.600403 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bf6f-account-create-update-87lj2" event={"ID":"90966ae5-9855-4e64-bebc-fc216f56de50","Type":"ContainerDied","Data":"150ff65d28c3fa30702904e41802aa5f757752092106ead00173e238027b7e54"} Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.600442 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="150ff65d28c3fa30702904e41802aa5f757752092106ead00173e238027b7e54" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.600497 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bf6f-account-create-update-87lj2" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.613304 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jc265" event={"ID":"7e2bcd3c-d57d-422c-921a-b4fadc65cb6f","Type":"ContainerDied","Data":"a86f28cd2cb8a5dd0c400beebbbf10839ff2ee11ba7add54389486116380c3b6"} Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.613334 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jc265" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.613364 4867 scope.go:117] "RemoveContainer" containerID="ecb815e96f46608a0a9c9c964ae75a0bc5ba7dcc4f294c45ec5ca6157cf1778b" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.630668 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-99g2f" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.630990 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-99g2f" event={"ID":"fedf8a1e-7645-4e0c-800a-e551181e5781","Type":"ContainerDied","Data":"1411f8fbf0aef246d9a1577516643932d9c5a0ce3e4d4842eb460e31c1bed992"} Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.631025 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1411f8fbf0aef246d9a1577516643932d9c5a0ce3e4d4842eb460e31c1bed992" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.713927 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jc265"] Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.730510 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jc265"] Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.730824 4867 scope.go:117] "RemoveContainer" containerID="32f1a3127e5e3c543c995e38338570ead3ce15746c76a58d7eab12d67f8d3ee5" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.851626 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" path="/var/lib/kubelet/pods/7e2bcd3c-d57d-422c-921a-b4fadc65cb6f/volumes" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.919878 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:52 crc kubenswrapper[4867]: I1201 09:27:52.947604 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.031499 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpxx2\" (UniqueName: \"kubernetes.io/projected/fda04dfd-8163-458a-baa4-df9622a4f5c6-kube-api-access-tpxx2\") pod \"fda04dfd-8163-458a-baa4-df9622a4f5c6\" (UID: \"fda04dfd-8163-458a-baa4-df9622a4f5c6\") " Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.031548 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda04dfd-8163-458a-baa4-df9622a4f5c6-operator-scripts\") pod \"fda04dfd-8163-458a-baa4-df9622a4f5c6\" (UID: \"fda04dfd-8163-458a-baa4-df9622a4f5c6\") " Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.032673 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda04dfd-8163-458a-baa4-df9622a4f5c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fda04dfd-8163-458a-baa4-df9622a4f5c6" (UID: "fda04dfd-8163-458a-baa4-df9622a4f5c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.047596 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda04dfd-8163-458a-baa4-df9622a4f5c6-kube-api-access-tpxx2" (OuterVolumeSpecName: "kube-api-access-tpxx2") pod "fda04dfd-8163-458a-baa4-df9622a4f5c6" (UID: "fda04dfd-8163-458a-baa4-df9622a4f5c6"). InnerVolumeSpecName "kube-api-access-tpxx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.120531 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.131427 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.132846 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfdb5\" (UniqueName: \"kubernetes.io/projected/8fe79080-dd3d-45d8-9929-030bb4eb72c3-kube-api-access-sfdb5\") pod \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\" (UID: \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\") " Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.132939 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fe79080-dd3d-45d8-9929-030bb4eb72c3-operator-scripts\") pod \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\" (UID: \"8fe79080-dd3d-45d8-9929-030bb4eb72c3\") " Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.133331 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpxx2\" (UniqueName: \"kubernetes.io/projected/fda04dfd-8163-458a-baa4-df9622a4f5c6-kube-api-access-tpxx2\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.133347 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda04dfd-8163-458a-baa4-df9622a4f5c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.133625 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fe79080-dd3d-45d8-9929-030bb4eb72c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fe79080-dd3d-45d8-9929-030bb4eb72c3" (UID: "8fe79080-dd3d-45d8-9929-030bb4eb72c3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.138869 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe79080-dd3d-45d8-9929-030bb4eb72c3-kube-api-access-sfdb5" (OuterVolumeSpecName: "kube-api-access-sfdb5") pod "8fe79080-dd3d-45d8-9929-030bb4eb72c3" (UID: "8fe79080-dd3d-45d8-9929-030bb4eb72c3"). InnerVolumeSpecName "kube-api-access-sfdb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.234124 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-operator-scripts\") pod \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\" (UID: \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\") " Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.234178 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhbht\" (UniqueName: \"kubernetes.io/projected/bbd0b6f8-d458-42e6-a07a-ba22d371037d-kube-api-access-xhbht\") pod \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\" (UID: \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\") " Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.234232 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcq8j\" (UniqueName: \"kubernetes.io/projected/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-kube-api-access-wcq8j\") pod \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\" (UID: \"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9\") " Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.234255 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd0b6f8-d458-42e6-a07a-ba22d371037d-operator-scripts\") pod \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\" (UID: \"bbd0b6f8-d458-42e6-a07a-ba22d371037d\") " Dec 01 09:27:53 crc 
kubenswrapper[4867]: I1201 09:27:53.234563 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfdb5\" (UniqueName: \"kubernetes.io/projected/8fe79080-dd3d-45d8-9929-030bb4eb72c3-kube-api-access-sfdb5\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.234574 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fe79080-dd3d-45d8-9929-030bb4eb72c3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.235026 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbd0b6f8-d458-42e6-a07a-ba22d371037d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbd0b6f8-d458-42e6-a07a-ba22d371037d" (UID: "bbd0b6f8-d458-42e6-a07a-ba22d371037d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.235351 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a97c7f8-1f59-4d4b-b689-7e5de839d1b9" (UID: "2a97c7f8-1f59-4d4b-b689-7e5de839d1b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.239004 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-kube-api-access-wcq8j" (OuterVolumeSpecName: "kube-api-access-wcq8j") pod "2a97c7f8-1f59-4d4b-b689-7e5de839d1b9" (UID: "2a97c7f8-1f59-4d4b-b689-7e5de839d1b9"). InnerVolumeSpecName "kube-api-access-wcq8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.240142 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd0b6f8-d458-42e6-a07a-ba22d371037d-kube-api-access-xhbht" (OuterVolumeSpecName: "kube-api-access-xhbht") pod "bbd0b6f8-d458-42e6-a07a-ba22d371037d" (UID: "bbd0b6f8-d458-42e6-a07a-ba22d371037d"). InnerVolumeSpecName "kube-api-access-xhbht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.336336 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.336374 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhbht\" (UniqueName: \"kubernetes.io/projected/bbd0b6f8-d458-42e6-a07a-ba22d371037d-kube-api-access-xhbht\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.336386 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcq8j\" (UniqueName: \"kubernetes.io/projected/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9-kube-api-access-wcq8j\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.336394 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbd0b6f8-d458-42e6-a07a-ba22d371037d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.643573 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-92d3-account-create-update-4ttg6" event={"ID":"bbd0b6f8-d458-42e6-a07a-ba22d371037d","Type":"ContainerDied","Data":"a5f7341f8c768b69c0905ef6c25eac4d60d1ee9952c0d96e9dbef31b9d66be87"} Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 
09:27:53.643628 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-92d3-account-create-update-4ttg6" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.643634 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5f7341f8c768b69c0905ef6c25eac4d60d1ee9952c0d96e9dbef31b9d66be87" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.650346 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h82jj" event={"ID":"fda04dfd-8163-458a-baa4-df9622a4f5c6","Type":"ContainerDied","Data":"e1929629f060c3ab057f70905858d2d61eb3374a02779b5b70da260baca56994"} Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.650398 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1929629f060c3ab057f70905858d2d61eb3374a02779b5b70da260baca56994" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.650501 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h82jj" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.654574 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d209-account-create-update-b55jg" event={"ID":"8fe79080-dd3d-45d8-9929-030bb4eb72c3","Type":"ContainerDied","Data":"07e1cf74a3b8d5c3303465e6a2e257b568c27152a2cf7a8774bdb80a71364d83"} Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.654616 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07e1cf74a3b8d5c3303465e6a2e257b568c27152a2cf7a8774bdb80a71364d83" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.654707 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d209-account-create-update-b55jg" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.657655 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s59tq" event={"ID":"2a97c7f8-1f59-4d4b-b689-7e5de839d1b9","Type":"ContainerDied","Data":"455dce731a725c22705537dbe89b9bafbeb6ef108645384e46846d7c71dd4c58"} Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.657701 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="455dce731a725c22705537dbe89b9bafbeb6ef108645384e46846d7c71dd4c58" Dec 01 09:27:53 crc kubenswrapper[4867]: I1201 09:27:53.657730 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s59tq" Dec 01 09:27:56 crc kubenswrapper[4867]: I1201 09:27:56.686453 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2vbhv" event={"ID":"39cdb2ad-9d97-4f37-90e4-a41f554c8755","Type":"ContainerStarted","Data":"09424ec7d4f890da08bd6c027064bea9c0b869c3719407b0535305d6868ae74f"} Dec 01 09:27:56 crc kubenswrapper[4867]: I1201 09:27:56.706899 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2vbhv" podStartSLOduration=2.954950206 podStartE2EDuration="8.706881363s" podCreationTimestamp="2025-12-01 09:27:48 +0000 UTC" firstStartedPulling="2025-12-01 09:27:49.735019033 +0000 UTC m=+1191.194405787" lastFinishedPulling="2025-12-01 09:27:55.48695019 +0000 UTC m=+1196.946336944" observedRunningTime="2025-12-01 09:27:56.69948197 +0000 UTC m=+1198.158868734" watchObservedRunningTime="2025-12-01 09:27:56.706881363 +0000 UTC m=+1198.166268117" Dec 01 09:27:59 crc kubenswrapper[4867]: I1201 09:27:59.711023 4867 generic.go:334] "Generic (PLEG): container finished" podID="39cdb2ad-9d97-4f37-90e4-a41f554c8755" containerID="09424ec7d4f890da08bd6c027064bea9c0b869c3719407b0535305d6868ae74f" exitCode=0 Dec 01 09:27:59 crc 
kubenswrapper[4867]: I1201 09:27:59.711070 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2vbhv" event={"ID":"39cdb2ad-9d97-4f37-90e4-a41f554c8755","Type":"ContainerDied","Data":"09424ec7d4f890da08bd6c027064bea9c0b869c3719407b0535305d6868ae74f"} Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.008666 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.161451 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-combined-ca-bundle\") pod \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.161497 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk6sr\" (UniqueName: \"kubernetes.io/projected/39cdb2ad-9d97-4f37-90e4-a41f554c8755-kube-api-access-hk6sr\") pod \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.161554 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-config-data\") pod \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\" (UID: \"39cdb2ad-9d97-4f37-90e4-a41f554c8755\") " Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.181560 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39cdb2ad-9d97-4f37-90e4-a41f554c8755-kube-api-access-hk6sr" (OuterVolumeSpecName: "kube-api-access-hk6sr") pod "39cdb2ad-9d97-4f37-90e4-a41f554c8755" (UID: "39cdb2ad-9d97-4f37-90e4-a41f554c8755"). InnerVolumeSpecName "kube-api-access-hk6sr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.198441 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39cdb2ad-9d97-4f37-90e4-a41f554c8755" (UID: "39cdb2ad-9d97-4f37-90e4-a41f554c8755"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.212346 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-config-data" (OuterVolumeSpecName: "config-data") pod "39cdb2ad-9d97-4f37-90e4-a41f554c8755" (UID: "39cdb2ad-9d97-4f37-90e4-a41f554c8755"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.263569 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.263598 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk6sr\" (UniqueName: \"kubernetes.io/projected/39cdb2ad-9d97-4f37-90e4-a41f554c8755-kube-api-access-hk6sr\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.263610 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39cdb2ad-9d97-4f37-90e4-a41f554c8755-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.727034 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2vbhv" 
event={"ID":"39cdb2ad-9d97-4f37-90e4-a41f554c8755","Type":"ContainerDied","Data":"babb86ff73430870ace32874ac73285df05d22ef9c04ccdb9dd78037999f41ba"} Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.727077 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="babb86ff73430870ace32874ac73285df05d22ef9c04ccdb9dd78037999f41ba" Dec 01 09:28:01 crc kubenswrapper[4867]: I1201 09:28:01.727151 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2vbhv" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044015 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4dxnw"] Dec 01 09:28:02 crc kubenswrapper[4867]: E1201 09:28:02.044442 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90966ae5-9855-4e64-bebc-fc216f56de50" containerName="mariadb-account-create-update" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044459 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="90966ae5-9855-4e64-bebc-fc216f56de50" containerName="mariadb-account-create-update" Dec 01 09:28:02 crc kubenswrapper[4867]: E1201 09:28:02.044475 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fedf8a1e-7645-4e0c-800a-e551181e5781" containerName="mariadb-database-create" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044485 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fedf8a1e-7645-4e0c-800a-e551181e5781" containerName="mariadb-database-create" Dec 01 09:28:02 crc kubenswrapper[4867]: E1201 09:28:02.044502 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda04dfd-8163-458a-baa4-df9622a4f5c6" containerName="mariadb-database-create" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044510 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda04dfd-8163-458a-baa4-df9622a4f5c6" containerName="mariadb-database-create" Dec 01 09:28:02 crc 
kubenswrapper[4867]: E1201 09:28:02.044531 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe79080-dd3d-45d8-9929-030bb4eb72c3" containerName="mariadb-account-create-update" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044540 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe79080-dd3d-45d8-9929-030bb4eb72c3" containerName="mariadb-account-create-update" Dec 01 09:28:02 crc kubenswrapper[4867]: E1201 09:28:02.044554 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39cdb2ad-9d97-4f37-90e4-a41f554c8755" containerName="keystone-db-sync" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044565 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cdb2ad-9d97-4f37-90e4-a41f554c8755" containerName="keystone-db-sync" Dec 01 09:28:02 crc kubenswrapper[4867]: E1201 09:28:02.044577 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" containerName="init" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044585 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" containerName="init" Dec 01 09:28:02 crc kubenswrapper[4867]: E1201 09:28:02.044600 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a97c7f8-1f59-4d4b-b689-7e5de839d1b9" containerName="mariadb-database-create" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044608 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a97c7f8-1f59-4d4b-b689-7e5de839d1b9" containerName="mariadb-database-create" Dec 01 09:28:02 crc kubenswrapper[4867]: E1201 09:28:02.044624 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" containerName="dnsmasq-dns" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044632 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" containerName="dnsmasq-dns" Dec 01 09:28:02 
crc kubenswrapper[4867]: E1201 09:28:02.044645 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd0b6f8-d458-42e6-a07a-ba22d371037d" containerName="mariadb-account-create-update" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044652 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd0b6f8-d458-42e6-a07a-ba22d371037d" containerName="mariadb-account-create-update" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044868 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2bcd3c-d57d-422c-921a-b4fadc65cb6f" containerName="dnsmasq-dns" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044888 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fda04dfd-8163-458a-baa4-df9622a4f5c6" containerName="mariadb-database-create" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044909 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a97c7f8-1f59-4d4b-b689-7e5de839d1b9" containerName="mariadb-database-create" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044928 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fedf8a1e-7645-4e0c-800a-e551181e5781" containerName="mariadb-database-create" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044941 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="39cdb2ad-9d97-4f37-90e4-a41f554c8755" containerName="keystone-db-sync" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044955 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd0b6f8-d458-42e6-a07a-ba22d371037d" containerName="mariadb-account-create-update" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044963 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="90966ae5-9855-4e64-bebc-fc216f56de50" containerName="mariadb-account-create-update" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.044976 4867 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="8fe79080-dd3d-45d8-9929-030bb4eb72c3" containerName="mariadb-account-create-update" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.046020 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.080389 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4dxnw"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.098852 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cgnvm"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.100685 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.119014 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cgnvm"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.119061 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.119286 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h9gmt" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.119456 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.119628 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.138103 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197248 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197313 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-svc\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197364 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-config-data\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197388 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-config\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197430 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv8hr\" (UniqueName: \"kubernetes.io/projected/3d716775-bb66-4346-94e1-20fb78c474d6-kube-api-access-nv8hr\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197499 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-credential-keys\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197530 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-fernet-keys\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197555 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-combined-ca-bundle\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197580 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197606 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glkv8\" (UniqueName: \"kubernetes.io/projected/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-kube-api-access-glkv8\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197646 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.197677 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-scripts\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.299745 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.299827 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-scripts\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.299874 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.299910 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-svc\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.299949 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-config-data\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.299969 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-config\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.300011 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv8hr\" (UniqueName: \"kubernetes.io/projected/3d716775-bb66-4346-94e1-20fb78c474d6-kube-api-access-nv8hr\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.300082 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-credential-keys\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.300115 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-fernet-keys\") pod 
\"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.300140 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-combined-ca-bundle\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.300161 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glkv8\" (UniqueName: \"kubernetes.io/projected/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-kube-api-access-glkv8\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.300184 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.301211 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.301944 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: 
\"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.306370 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.307048 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-svc\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.313152 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-config\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.320743 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-fernet-keys\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.321051 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-scripts\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.333876 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-credential-keys\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.334766 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-combined-ca-bundle\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.352185 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-config-data\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.363154 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv8hr\" (UniqueName: \"kubernetes.io/projected/3d716775-bb66-4346-94e1-20fb78c474d6-kube-api-access-nv8hr\") pod \"keystone-bootstrap-cgnvm\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.369714 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glkv8\" (UniqueName: \"kubernetes.io/projected/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-kube-api-access-glkv8\") pod \"dnsmasq-dns-847c4cc679-4dxnw\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.380510 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.450569 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.525439 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56fc8957fc-4449t"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.526918 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.529775 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56fc8957fc-4449t"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.542391 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.542637 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-s629p" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.542859 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.543188 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.588124 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xmtc6"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.595725 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.610720 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-56qbp"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.619785 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2vzn\" (UniqueName: \"kubernetes.io/projected/88d30955-4c30-4000-b6c7-ab3414b9bb32-kube-api-access-q2vzn\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.619976 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-config-data\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.620091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-scripts\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.620219 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/88d30955-4c30-4000-b6c7-ab3414b9bb32-horizon-secret-key\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.620408 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/88d30955-4c30-4000-b6c7-ab3414b9bb32-logs\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.629954 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.631158 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.638285 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fqlhc" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.647028 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.647273 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5f4nz" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.647419 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.723545 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65b95ca9-4891-4e69-a789-a21549f94247-etc-machine-id\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.723613 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-combined-ca-bundle\") pod \"barbican-db-sync-56qbp\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " 
pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.723655 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8jtw\" (UniqueName: \"kubernetes.io/projected/65b95ca9-4891-4e69-a789-a21549f94247-kube-api-access-n8jtw\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.723687 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/88d30955-4c30-4000-b6c7-ab3414b9bb32-horizon-secret-key\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.723737 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-combined-ca-bundle\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.723801 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d30955-4c30-4000-b6c7-ab3414b9bb32-logs\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.723902 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2vzn\" (UniqueName: \"kubernetes.io/projected/88d30955-4c30-4000-b6c7-ab3414b9bb32-kube-api-access-q2vzn\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 
crc kubenswrapper[4867]: I1201 09:28:02.723970 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-db-sync-config-data\") pod \"barbican-db-sync-56qbp\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.723997 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-db-sync-config-data\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.724029 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-scripts\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.724053 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-config-data\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.724085 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w979\" (UniqueName: \"kubernetes.io/projected/a891c34b-01dc-4e65-ad1d-b21597555988-kube-api-access-2w979\") pod \"barbican-db-sync-56qbp\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: 
I1201 09:28:02.724111 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-config-data\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.724204 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-scripts\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.725736 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d30955-4c30-4000-b6c7-ab3414b9bb32-logs\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.725900 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-scripts\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.737419 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-config-data\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.746877 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-56qbp"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 
09:28:02.769536 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xmtc6"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.779234 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/88d30955-4c30-4000-b6c7-ab3414b9bb32-horizon-secret-key\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.803871 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-s52lq"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.804963 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.818619 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2vzn\" (UniqueName: \"kubernetes.io/projected/88d30955-4c30-4000-b6c7-ab3414b9bb32-kube-api-access-q2vzn\") pod \"horizon-56fc8957fc-4449t\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.824637 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.825903 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.829764 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4smxn" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.834557 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w979\" (UniqueName: \"kubernetes.io/projected/a891c34b-01dc-4e65-ad1d-b21597555988-kube-api-access-2w979\") pod 
\"barbican-db-sync-56qbp\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.834888 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65b95ca9-4891-4e69-a789-a21549f94247-etc-machine-id\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.834939 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-combined-ca-bundle\") pod \"barbican-db-sync-56qbp\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.835012 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8jtw\" (UniqueName: \"kubernetes.io/projected/65b95ca9-4891-4e69-a789-a21549f94247-kube-api-access-n8jtw\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.835096 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-combined-ca-bundle\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.835245 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65b95ca9-4891-4e69-a789-a21549f94247-etc-machine-id\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " 
pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.835365 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-db-sync-config-data\") pod \"barbican-db-sync-56qbp\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.835396 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-db-sync-config-data\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.835436 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-scripts\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.835473 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-config-data\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.840663 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-combined-ca-bundle\") pod \"barbican-db-sync-56qbp\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.857111 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-scripts\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.857930 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-db-sync-config-data\") pod \"barbican-db-sync-56qbp\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.858132 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-combined-ca-bundle\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.859164 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-config-data\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.865640 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-db-sync-config-data\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.866578 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w979\" (UniqueName: \"kubernetes.io/projected/a891c34b-01dc-4e65-ad1d-b21597555988-kube-api-access-2w979\") pod 
\"barbican-db-sync-56qbp\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.899372 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8jtw\" (UniqueName: \"kubernetes.io/projected/65b95ca9-4891-4e69-a789-a21549f94247-kube-api-access-n8jtw\") pod \"cinder-db-sync-xmtc6\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.919788 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-674b4ff7c9-r6p2x"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.922738 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.950910 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s52lq"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.969134 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.974253 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-56qbp" Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.977877 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-674b4ff7c9-r6p2x"] Dec 01 09:28:02 crc kubenswrapper[4867]: I1201 09:28:02.988746 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.012781 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4dxnw"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.028640 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-phzxd"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.029696 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.041585 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd29t\" (UniqueName: \"kubernetes.io/projected/b0894df6-2174-4d0a-9e26-93650fd0e925-kube-api-access-rd29t\") pod \"neutron-db-sync-s52lq\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.044013 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-config\") pod \"neutron-db-sync-s52lq\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.044085 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-combined-ca-bundle\") pod \"neutron-db-sync-s52lq\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.049654 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.050011 4867 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.050168 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8pv9s" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.058963 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-phzxd"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.121573 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-flkpw"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.123058 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147131 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-scripts\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147168 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-combined-ca-bundle\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147198 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd29t\" (UniqueName: \"kubernetes.io/projected/b0894df6-2174-4d0a-9e26-93650fd0e925-kube-api-access-rd29t\") pod \"neutron-db-sync-s52lq\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147231 
4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhnb\" (UniqueName: \"kubernetes.io/projected/fb0c725b-663d-4764-b156-9426923ce046-kube-api-access-vxhnb\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147254 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded298f7-555e-4f07-9125-0c2f78158131-logs\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147277 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ded298f7-555e-4f07-9125-0c2f78158131-horizon-secret-key\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147304 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl97c\" (UniqueName: \"kubernetes.io/projected/ded298f7-555e-4f07-9125-0c2f78158131-kube-api-access-wl97c\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147325 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-config\") pod \"neutron-db-sync-s52lq\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147357 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-scripts\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147378 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-combined-ca-bundle\") pod \"neutron-db-sync-s52lq\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.147403 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-config-data\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.153333 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.154695 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.171969 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb0c725b-663d-4764-b156-9426923ce046-logs\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.173008 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-config-data\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.175551 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.175600 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.175915 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qmkvk" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.176031 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.203889 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-flkpw"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.216598 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd29t\" (UniqueName: \"kubernetes.io/projected/b0894df6-2174-4d0a-9e26-93650fd0e925-kube-api-access-rd29t\") pod \"neutron-db-sync-s52lq\" (UID: 
\"b0894df6-2174-4d0a-9e26-93650fd0e925\") " pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.217695 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-combined-ca-bundle\") pod \"neutron-db-sync-s52lq\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.218204 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-config\") pod \"neutron-db-sync-s52lq\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.261676 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.273131 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.275255 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286684 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286746 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxhnb\" (UniqueName: \"kubernetes.io/projected/fb0c725b-663d-4764-b156-9426923ce046-kube-api-access-vxhnb\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286777 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286793 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ddz2\" (UniqueName: \"kubernetes.io/projected/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-kube-api-access-4ddz2\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286846 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvpvd\" (UniqueName: \"kubernetes.io/projected/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-kube-api-access-kvpvd\") pod \"glance-default-external-api-0\" 
(UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286867 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded298f7-555e-4f07-9125-0c2f78158131-logs\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286893 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ded298f7-555e-4f07-9125-0c2f78158131-horizon-secret-key\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286908 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286926 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286959 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl97c\" (UniqueName: \"kubernetes.io/projected/ded298f7-555e-4f07-9125-0c2f78158131-kube-api-access-wl97c\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: 
\"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.286989 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287007 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-logs\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287054 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-scripts\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287106 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-config\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287144 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " 
pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287167 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-config-data\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287257 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb0c725b-663d-4764-b156-9426923ce046-logs\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287274 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287304 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287320 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" 
Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287342 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287373 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-config-data\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287414 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-scripts\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.287456 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-combined-ca-bundle\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.290633 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-scripts\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.291209 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded298f7-555e-4f07-9125-0c2f78158131-logs\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.291541 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-config-data\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.291776 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb0c725b-663d-4764-b156-9426923ce046-logs\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.293513 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.293664 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.296421 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-scripts\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.302260 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-combined-ca-bundle\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " 
pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.314581 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-config-data\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.327898 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ded298f7-555e-4f07-9125-0c2f78158131-horizon-secret-key\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.346357 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.352502 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxhnb\" (UniqueName: \"kubernetes.io/projected/fb0c725b-663d-4764-b156-9426923ce046-kube-api-access-vxhnb\") pod \"placement-db-sync-phzxd\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.366320 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398656 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-run-httpd\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398712 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398742 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398763 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398779 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 
crc kubenswrapper[4867]: I1201 09:28:03.398843 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398871 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398902 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398933 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ddz2\" (UniqueName: \"kubernetes.io/projected/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-kube-api-access-4ddz2\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398953 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpvd\" (UniqueName: \"kubernetes.io/projected/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-kube-api-access-kvpvd\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398975 
4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.398994 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.399015 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwqhj\" (UniqueName: \"kubernetes.io/projected/e0509290-ed5f-4982-bae7-8710f1eeb88f-kube-api-access-hwqhj\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.399046 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.399060 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-logs\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.399096 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.399110 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-scripts\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.399132 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-config\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.399151 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.399169 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-log-httpd\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.399198 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-config-data\") pod \"ceilometer-0\" 
(UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.403870 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.404239 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.410572 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.410941 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.411155 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " 
pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.411171 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-logs\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.411974 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.412487 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.413002 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-config\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.427030 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.427499 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.431008 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.435667 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.436666 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl97c\" (UniqueName: \"kubernetes.io/projected/ded298f7-555e-4f07-9125-0c2f78158131-kube-api-access-wl97c\") pod \"horizon-674b4ff7c9-r6p2x\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.436858 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.437853 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.442586 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.477051 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.490663 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvpvd\" (UniqueName: 
\"kubernetes.io/projected/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-kube-api-access-kvpvd\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.491580 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ddz2\" (UniqueName: \"kubernetes.io/projected/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-kube-api-access-4ddz2\") pod \"dnsmasq-dns-785d8bcb8c-flkpw\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.503220 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.503450 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-scripts\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.503540 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-log-httpd\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.503622 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-config-data\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " 
pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.503693 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-run-httpd\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.503760 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.503911 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.503991 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xptfz\" (UniqueName: \"kubernetes.io/projected/e4c79093-2fb1-49d4-9fbd-abca828a44c9-kube-api-access-xptfz\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.504067 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.504131 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.504203 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.504305 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.504411 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.504487 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwqhj\" (UniqueName: \"kubernetes.io/projected/e0509290-ed5f-4982-bae7-8710f1eeb88f-kube-api-access-hwqhj\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc 
kubenswrapper[4867]: I1201 09:28:03.504576 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.505485 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-run-httpd\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.506060 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-log-httpd\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.513193 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.514736 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.520600 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-scripts\") 
pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.546686 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.554096 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.562464 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-config-data\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.562625 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.566129 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwqhj\" (UniqueName: \"kubernetes.io/projected/e0509290-ed5f-4982-bae7-8710f1eeb88f-kube-api-access-hwqhj\") pod \"ceilometer-0\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.631267 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.631597 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xptfz\" (UniqueName: \"kubernetes.io/projected/e4c79093-2fb1-49d4-9fbd-abca828a44c9-kube-api-access-xptfz\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.631630 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.631656 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc 
kubenswrapper[4867]: I1201 09:28:03.631688 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.631752 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.632006 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.632115 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.632634 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.633269 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.635850 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.635974 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.636721 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.646223 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.670125 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.678993 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xptfz\" (UniqueName: \"kubernetes.io/projected/e4c79093-2fb1-49d4-9fbd-abca828a44c9-kube-api-access-xptfz\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.701643 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.722985 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.752537 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4dxnw"] Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.780595 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.799564 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" event={"ID":"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f","Type":"ContainerStarted","Data":"ccad30f55f60fa01989d068a5f0453695f37027ecbee650909bdd2314b2816ee"} Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 
09:28:03.831772 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 09:28:03 crc kubenswrapper[4867]: I1201 09:28:03.955877 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.024878 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cgnvm"] Dec 01 09:28:04 crc kubenswrapper[4867]: W1201 09:28:04.043601 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d716775_bb66_4346_94e1_20fb78c474d6.slice/crio-6824e02f30714ce87b27892b54e1fe973f8e3131936fe7b67f16dea1c45c957f WatchSource:0}: Error finding container 6824e02f30714ce87b27892b54e1fe973f8e3131936fe7b67f16dea1c45c957f: Status 404 returned error can't find the container with id 6824e02f30714ce87b27892b54e1fe973f8e3131936fe7b67f16dea1c45c957f Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.401456 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56fc8957fc-4449t"] Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.424314 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-56qbp"] Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.432439 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xmtc6"] Dec 01 09:28:04 crc kubenswrapper[4867]: W1201 09:28:04.435369 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda891c34b_01dc_4e65_ad1d_b21597555988.slice/crio-90600535d61aeec92178fd9653eade62cdc9367914d5a4c25737c473d1e50ea7 WatchSource:0}: Error finding container 90600535d61aeec92178fd9653eade62cdc9367914d5a4c25737c473d1e50ea7: Status 404 returned error can't find the container with id 
90600535d61aeec92178fd9653eade62cdc9367914d5a4c25737c473d1e50ea7 Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.574436 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-674b4ff7c9-r6p2x"] Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.780694 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s52lq"] Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.804512 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-phzxd"] Dec 01 09:28:04 crc kubenswrapper[4867]: W1201 09:28:04.804920 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb0c725b_663d_4764_b156_9426923ce046.slice/crio-601a7deec5b6de487d05d80edeea804fe628d8989aa388578fded2fb02f45cdd WatchSource:0}: Error finding container 601a7deec5b6de487d05d80edeea804fe628d8989aa388578fded2fb02f45cdd: Status 404 returned error can't find the container with id 601a7deec5b6de487d05d80edeea804fe628d8989aa388578fded2fb02f45cdd Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.823364 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-flkpw"] Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.824070 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56fc8957fc-4449t" event={"ID":"88d30955-4c30-4000-b6c7-ab3414b9bb32","Type":"ContainerStarted","Data":"4fe8b1d25112b2c4ebfc3188d41b3397ac3f9dbf1e5cf5223160037be0ee2940"} Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.828035 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cgnvm" event={"ID":"3d716775-bb66-4346-94e1-20fb78c474d6","Type":"ContainerStarted","Data":"6824e02f30714ce87b27892b54e1fe973f8e3131936fe7b67f16dea1c45c957f"} Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.843675 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-s52lq" event={"ID":"b0894df6-2174-4d0a-9e26-93650fd0e925","Type":"ContainerStarted","Data":"927e7126f3920e5505f19debebdfa8f66b1cea57443989f2bf436c3c3dd6fb0b"} Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.846648 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-56qbp" event={"ID":"a891c34b-01dc-4e65-ad1d-b21597555988","Type":"ContainerStarted","Data":"90600535d61aeec92178fd9653eade62cdc9367914d5a4c25737c473d1e50ea7"} Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.847449 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-674b4ff7c9-r6p2x" event={"ID":"ded298f7-555e-4f07-9125-0c2f78158131","Type":"ContainerStarted","Data":"a27f2bce5627a525ec19ca0caecdbecaeb604c16af3e8382a5d83d8fae5317a7"} Dec 01 09:28:04 crc kubenswrapper[4867]: I1201 09:28:04.848485 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xmtc6" event={"ID":"65b95ca9-4891-4e69-a789-a21549f94247","Type":"ContainerStarted","Data":"2762bd279ced79faaafbafa2f78f02e2ab2a6df0cbc4f5da18d6866d70e6d419"} Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.055420 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.220161 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:28:05 crc kubenswrapper[4867]: W1201 09:28:05.267019 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4c79093_2fb1_49d4_9fbd_abca828a44c9.slice/crio-9ceb92834ae8ce485cea67823fb68d46ab65996438a0a4d0feb18e13e3a0e9b0 WatchSource:0}: Error finding container 9ceb92834ae8ce485cea67823fb68d46ab65996438a0a4d0feb18e13e3a0e9b0: Status 404 returned error can't find the container with id 9ceb92834ae8ce485cea67823fb68d46ab65996438a0a4d0feb18e13e3a0e9b0 Dec 01 09:28:05 crc 
kubenswrapper[4867]: I1201 09:28:05.642328 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.695869 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56fc8957fc-4449t"] Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.729737 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.772994 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7599b45569-t2jm7"] Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.819609 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.873587 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7599b45569-t2jm7"] Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.892502 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0509290-ed5f-4982-bae7-8710f1eeb88f","Type":"ContainerStarted","Data":"4e5cba66c72b3746a5204d32971df880a01c6841f8a84fcc0507a3ba4d3ded53"} Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.894258 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" event={"ID":"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714","Type":"ContainerStarted","Data":"c5f2da736fe042cca441f31b99ed3e62d9daa06d6218835002e6f3cb799bed72"} Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.896597 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-phzxd" event={"ID":"fb0c725b-663d-4764-b156-9426923ce046","Type":"ContainerStarted","Data":"601a7deec5b6de487d05d80edeea804fe628d8989aa388578fded2fb02f45cdd"} Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.911904 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.941207 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8882ae2-488c-41aa-96a2-b94cca902a3a-horizon-secret-key\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.941247 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8882ae2-488c-41aa-96a2-b94cca902a3a-logs\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.941284 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkqbx\" (UniqueName: \"kubernetes.io/projected/d8882ae2-488c-41aa-96a2-b94cca902a3a-kube-api-access-nkqbx\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.941334 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-scripts\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.941382 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-config-data\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " 
pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.962092 4867 generic.go:334] "Generic (PLEG): container finished" podID="1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" containerID="a53d069653d6668310de541dcd82f860df87a7539ccbee3e1c8aa6a3b1cc4097" exitCode=0 Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.962343 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" event={"ID":"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f","Type":"ContainerDied","Data":"a53d069653d6668310de541dcd82f860df87a7539ccbee3e1c8aa6a3b1cc4097"} Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.983036 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s52lq" event={"ID":"b0894df6-2174-4d0a-9e26-93650fd0e925","Type":"ContainerStarted","Data":"738e1ba32f0935e93690fe3ae7b01f58519ac3f31f26a9814957c63353df460f"} Dec 01 09:28:05 crc kubenswrapper[4867]: I1201 09:28:05.985117 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4c79093-2fb1-49d4-9fbd-abca828a44c9","Type":"ContainerStarted","Data":"9ceb92834ae8ce485cea67823fb68d46ab65996438a0a4d0feb18e13e3a0e9b0"} Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.009678 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cgnvm" event={"ID":"3d716775-bb66-4346-94e1-20fb78c474d6","Type":"ContainerStarted","Data":"c12cd2cba9dafa9c7cf8aa656c006db0c49575f4b14ae890016ee20a169bbe14"} Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.031324 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-s52lq" podStartSLOduration=4.03117627 podStartE2EDuration="4.03117627s" podCreationTimestamp="2025-12-01 09:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:06.016272531 +0000 
UTC m=+1207.475659285" watchObservedRunningTime="2025-12-01 09:28:06.03117627 +0000 UTC m=+1207.490563034" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.041040 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cgnvm" podStartSLOduration=4.04102247 podStartE2EDuration="4.04102247s" podCreationTimestamp="2025-12-01 09:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:06.04027453 +0000 UTC m=+1207.499661284" watchObservedRunningTime="2025-12-01 09:28:06.04102247 +0000 UTC m=+1207.500409224" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.045182 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8882ae2-488c-41aa-96a2-b94cca902a3a-horizon-secret-key\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.045230 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8882ae2-488c-41aa-96a2-b94cca902a3a-logs\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.045260 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkqbx\" (UniqueName: \"kubernetes.io/projected/d8882ae2-488c-41aa-96a2-b94cca902a3a-kube-api-access-nkqbx\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.045314 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-scripts\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.045365 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-config-data\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.046681 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-config-data\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.048960 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8882ae2-488c-41aa-96a2-b94cca902a3a-logs\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.049869 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-scripts\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.080419 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkqbx\" (UniqueName: \"kubernetes.io/projected/d8882ae2-488c-41aa-96a2-b94cca902a3a-kube-api-access-nkqbx\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " 
pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.081505 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8882ae2-488c-41aa-96a2-b94cca902a3a-horizon-secret-key\") pod \"horizon-7599b45569-t2jm7\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: E1201 09:28:06.101618 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a4beb9a_ac0b_4cf8_b138_5eb414acee9f.slice/crio-conmon-a53d069653d6668310de541dcd82f860df87a7539ccbee3e1c8aa6a3b1cc4097.scope\": RecentStats: unable to find data in memory cache]" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.202843 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.220920 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.628965 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.659854 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-sb\") pod \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.659948 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-nb\") pod \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.660020 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-config\") pod \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.660057 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-svc\") pod \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.660129 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glkv8\" (UniqueName: \"kubernetes.io/projected/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-kube-api-access-glkv8\") pod \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.660161 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-swift-storage-0\") pod \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\" (UID: \"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f\") " Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.725434 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-kube-api-access-glkv8" (OuterVolumeSpecName: "kube-api-access-glkv8") pod "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" (UID: "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f"). InnerVolumeSpecName "kube-api-access-glkv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.729765 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" (UID: "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.764374 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glkv8\" (UniqueName: \"kubernetes.io/projected/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-kube-api-access-glkv8\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.764806 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.776380 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" (UID: "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.791638 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-config" (OuterVolumeSpecName: "config") pod "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" (UID: "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.823775 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" (UID: "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.836003 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" (UID: "1a4beb9a-ac0b-4cf8-b138-5eb414acee9f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.866374 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.866412 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.866422 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.866431 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:06 crc kubenswrapper[4867]: I1201 09:28:06.980512 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7599b45569-t2jm7"] Dec 01 09:28:06 crc kubenswrapper[4867]: W1201 09:28:06.995596 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8882ae2_488c_41aa_96a2_b94cca902a3a.slice/crio-bc7ac0b44fb97deb41cc686450e7e7607346bb8eddc2667fae2588d187015699 WatchSource:0}: Error finding container bc7ac0b44fb97deb41cc686450e7e7607346bb8eddc2667fae2588d187015699: Status 404 returned error can't find the container with id bc7ac0b44fb97deb41cc686450e7e7607346bb8eddc2667fae2588d187015699 Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.025949 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7599b45569-t2jm7" 
event={"ID":"d8882ae2-488c-41aa-96a2-b94cca902a3a","Type":"ContainerStarted","Data":"bc7ac0b44fb97deb41cc686450e7e7607346bb8eddc2667fae2588d187015699"} Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.030070 4867 generic.go:334] "Generic (PLEG): container finished" podID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" containerID="e499648acc00bf09c624be59d58019ff8d070ffdcbd51ce42df9d55b760259e2" exitCode=0 Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.030161 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" event={"ID":"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714","Type":"ContainerDied","Data":"e499648acc00bf09c624be59d58019ff8d070ffdcbd51ce42df9d55b760259e2"} Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.050917 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.051216 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-4dxnw" event={"ID":"1a4beb9a-ac0b-4cf8-b138-5eb414acee9f","Type":"ContainerDied","Data":"ccad30f55f60fa01989d068a5f0453695f37027ecbee650909bdd2314b2816ee"} Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.051272 4867 scope.go:117] "RemoveContainer" containerID="a53d069653d6668310de541dcd82f860df87a7539ccbee3e1c8aa6a3b1cc4097" Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.060793 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2","Type":"ContainerStarted","Data":"c0cb33c6854945f3e87a92848a4f615cc19c7228393247a8650e63c707ed8654"} Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.069358 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e4c79093-2fb1-49d4-9fbd-abca828a44c9","Type":"ContainerStarted","Data":"f80f32ef76a1e5259200395738dece2f006f173adf48f9885430ab90a04d3e60"} Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.156154 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4dxnw"] Dec 01 09:28:07 crc kubenswrapper[4867]: I1201 09:28:07.176349 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4dxnw"] Dec 01 09:28:08 crc kubenswrapper[4867]: I1201 09:28:08.170968 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" event={"ID":"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714","Type":"ContainerStarted","Data":"6e2cccfe484f54023a70759364d017117b21672280aee0dd1f596ffe67b29e55"} Dec 01 09:28:08 crc kubenswrapper[4867]: I1201 09:28:08.171684 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:08 crc kubenswrapper[4867]: I1201 09:28:08.176240 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2","Type":"ContainerStarted","Data":"c7a54dad3df8a3b7ee7088d0da50f65eea913be27ca846d01c8b7e688dd22649"} Dec 01 09:28:08 crc kubenswrapper[4867]: I1201 09:28:08.198205 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" podStartSLOduration=6.198186223 podStartE2EDuration="6.198186223s" podCreationTimestamp="2025-12-01 09:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:08.196629771 +0000 UTC m=+1209.656016525" watchObservedRunningTime="2025-12-01 09:28:08.198186223 +0000 UTC m=+1209.657572977" Dec 01 09:28:08 crc kubenswrapper[4867]: I1201 09:28:08.854406 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" path="/var/lib/kubelet/pods/1a4beb9a-ac0b-4cf8-b138-5eb414acee9f/volumes" Dec 01 09:28:09 crc kubenswrapper[4867]: I1201 09:28:09.190091 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerName="glance-log" containerID="cri-o://f80f32ef76a1e5259200395738dece2f006f173adf48f9885430ab90a04d3e60" gracePeriod=30 Dec 01 09:28:09 crc kubenswrapper[4867]: I1201 09:28:09.190506 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerName="glance-httpd" containerID="cri-o://835fe82b6ff65e05d44667c0a9eaaff1cac0d1626bb6ee7f2f395995be0bec1a" gracePeriod=30 Dec 01 09:28:09 crc kubenswrapper[4867]: I1201 09:28:09.190540 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4c79093-2fb1-49d4-9fbd-abca828a44c9","Type":"ContainerStarted","Data":"835fe82b6ff65e05d44667c0a9eaaff1cac0d1626bb6ee7f2f395995be0bec1a"} Dec 01 09:28:09 crc kubenswrapper[4867]: I1201 09:28:09.212891 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.21287276 podStartE2EDuration="6.21287276s" podCreationTimestamp="2025-12-01 09:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:09.211175673 +0000 UTC m=+1210.670562427" watchObservedRunningTime="2025-12-01 09:28:09.21287276 +0000 UTC m=+1210.672259514" Dec 01 09:28:10 crc kubenswrapper[4867]: I1201 09:28:10.256187 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2","Type":"ContainerStarted","Data":"852c33f1692d9de3d928ef6486d77bbd86b85717f3b8d1ea44ad0419e08963a3"} Dec 01 09:28:10 crc kubenswrapper[4867]: I1201 09:28:10.259419 4867 generic.go:334] "Generic (PLEG): container finished" podID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerID="f80f32ef76a1e5259200395738dece2f006f173adf48f9885430ab90a04d3e60" exitCode=143 Dec 01 09:28:10 crc kubenswrapper[4867]: I1201 09:28:10.260051 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4c79093-2fb1-49d4-9fbd-abca828a44c9","Type":"ContainerDied","Data":"f80f32ef76a1e5259200395738dece2f006f173adf48f9885430ab90a04d3e60"} Dec 01 09:28:11 crc kubenswrapper[4867]: I1201 09:28:11.286827 4867 generic.go:334] "Generic (PLEG): container finished" podID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerID="835fe82b6ff65e05d44667c0a9eaaff1cac0d1626bb6ee7f2f395995be0bec1a" exitCode=0 Dec 01 09:28:11 crc kubenswrapper[4867]: I1201 09:28:11.286923 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4c79093-2fb1-49d4-9fbd-abca828a44c9","Type":"ContainerDied","Data":"835fe82b6ff65e05d44667c0a9eaaff1cac0d1626bb6ee7f2f395995be0bec1a"} Dec 01 09:28:11 crc kubenswrapper[4867]: I1201 09:28:11.288200 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerName="glance-httpd" containerID="cri-o://852c33f1692d9de3d928ef6486d77bbd86b85717f3b8d1ea44ad0419e08963a3" gracePeriod=30 Dec 01 09:28:11 crc kubenswrapper[4867]: I1201 09:28:11.288470 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerName="glance-log" containerID="cri-o://c7a54dad3df8a3b7ee7088d0da50f65eea913be27ca846d01c8b7e688dd22649" 
gracePeriod=30 Dec 01 09:28:11 crc kubenswrapper[4867]: I1201 09:28:11.324928 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.324900704 podStartE2EDuration="9.324900704s" podCreationTimestamp="2025-12-01 09:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:11.320271286 +0000 UTC m=+1212.779658050" watchObservedRunningTime="2025-12-01 09:28:11.324900704 +0000 UTC m=+1212.784287468" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.310793 4867 generic.go:334] "Generic (PLEG): container finished" podID="3d716775-bb66-4346-94e1-20fb78c474d6" containerID="c12cd2cba9dafa9c7cf8aa656c006db0c49575f4b14ae890016ee20a169bbe14" exitCode=0 Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.310999 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cgnvm" event={"ID":"3d716775-bb66-4346-94e1-20fb78c474d6","Type":"ContainerDied","Data":"c12cd2cba9dafa9c7cf8aa656c006db0c49575f4b14ae890016ee20a169bbe14"} Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.315593 4867 generic.go:334] "Generic (PLEG): container finished" podID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerID="852c33f1692d9de3d928ef6486d77bbd86b85717f3b8d1ea44ad0419e08963a3" exitCode=0 Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.315618 4867 generic.go:334] "Generic (PLEG): container finished" podID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerID="c7a54dad3df8a3b7ee7088d0da50f65eea913be27ca846d01c8b7e688dd22649" exitCode=143 Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.315637 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2","Type":"ContainerDied","Data":"852c33f1692d9de3d928ef6486d77bbd86b85717f3b8d1ea44ad0419e08963a3"} Dec 01 09:28:12 crc 
kubenswrapper[4867]: I1201 09:28:12.315660 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2","Type":"ContainerDied","Data":"c7a54dad3df8a3b7ee7088d0da50f65eea913be27ca846d01c8b7e688dd22649"} Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.529322 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-674b4ff7c9-r6p2x"] Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.563818 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c846795f4-k7mlj"] Dec 01 09:28:12 crc kubenswrapper[4867]: E1201 09:28:12.564157 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" containerName="init" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.564171 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" containerName="init" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.565678 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a4beb9a-ac0b-4cf8-b138-5eb414acee9f" containerName="init" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.566528 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.575973 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.576905 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c846795f4-k7mlj"] Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.631368 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7599b45569-t2jm7"] Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.664596 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d47c7cb76-srf4p"] Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.666575 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.679325 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d47c7cb76-srf4p"] Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.689781 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-logs\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.689828 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-config-data\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.689875 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-combined-ca-bundle\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.689893 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-tls-certs\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.689979 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-secret-key\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.689998 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-scripts\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.690026 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpbwt\" (UniqueName: \"kubernetes.io/projected/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-kube-api-access-hpbwt\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841023 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-secret-key\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841097 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-scripts\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841127 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v92j\" (UniqueName: \"kubernetes.io/projected/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-kube-api-access-4v92j\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841206 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpbwt\" (UniqueName: \"kubernetes.io/projected/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-kube-api-access-hpbwt\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841231 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-horizon-secret-key\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841252 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-horizon-tls-certs\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841274 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-combined-ca-bundle\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841522 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-logs\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841545 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-config-data\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841573 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-scripts\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841592 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-combined-ca-bundle\") pod 
\"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841612 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-tls-certs\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841643 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-logs\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.841676 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-config-data\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.842028 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-logs\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.842767 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-scripts\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc 
kubenswrapper[4867]: I1201 09:28:12.843082 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-config-data\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.847824 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-secret-key\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.849802 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-combined-ca-bundle\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.858303 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-tls-certs\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.882626 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpbwt\" (UniqueName: \"kubernetes.io/projected/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-kube-api-access-hpbwt\") pod \"horizon-c846795f4-k7mlj\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") " pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.905401 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.952658 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-scripts\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.952725 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-logs\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.952761 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-config-data\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.952816 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v92j\" (UniqueName: \"kubernetes.io/projected/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-kube-api-access-4v92j\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.953615 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-logs\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.961021 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-horizon-secret-key\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.961078 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-horizon-tls-certs\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.961119 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-combined-ca-bundle\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.962393 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-scripts\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.962504 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-config-data\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:12 crc kubenswrapper[4867]: I1201 09:28:12.984925 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-horizon-secret-key\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:13 crc kubenswrapper[4867]: I1201 09:28:13.000032 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-combined-ca-bundle\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:13 crc kubenswrapper[4867]: I1201 09:28:13.002580 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v92j\" (UniqueName: \"kubernetes.io/projected/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-kube-api-access-4v92j\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:13 crc kubenswrapper[4867]: I1201 09:28:13.037922 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bd4fac2-df2c-4aab-bf00-99b54a83ddca-horizon-tls-certs\") pod \"horizon-d47c7cb76-srf4p\" (UID: \"8bd4fac2-df2c-4aab-bf00-99b54a83ddca\") " pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:13 crc kubenswrapper[4867]: I1201 09:28:13.292444 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:28:13 crc kubenswrapper[4867]: I1201 09:28:13.551732 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:28:13 crc kubenswrapper[4867]: I1201 09:28:13.654345 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-k9s48"] Dec 01 09:28:13 crc kubenswrapper[4867]: I1201 09:28:13.654651 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns" containerID="cri-o://9adfed2bde346b21fa4adfc36b9773c70edefc16f3d71c0ef87af1899b53d9d2" gracePeriod=10 Dec 01 09:28:14 crc kubenswrapper[4867]: I1201 09:28:14.341307 4867 generic.go:334] "Generic (PLEG): container finished" podID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerID="9adfed2bde346b21fa4adfc36b9773c70edefc16f3d71c0ef87af1899b53d9d2" exitCode=0 Dec 01 09:28:14 crc kubenswrapper[4867]: I1201 09:28:14.341600 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" event={"ID":"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a","Type":"ContainerDied","Data":"9adfed2bde346b21fa4adfc36b9773c70edefc16f3d71c0ef87af1899b53d9d2"} Dec 01 09:28:16 crc kubenswrapper[4867]: I1201 09:28:16.122357 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: connect: connection refused" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.125894 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.130069 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.190884 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-internal-tls-certs\") pod \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.190957 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-config-data\") pod \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191010 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv8hr\" (UniqueName: \"kubernetes.io/projected/3d716775-bb66-4346-94e1-20fb78c474d6-kube-api-access-nv8hr\") pod \"3d716775-bb66-4346-94e1-20fb78c474d6\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191038 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-combined-ca-bundle\") pod \"3d716775-bb66-4346-94e1-20fb78c474d6\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191104 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-httpd-run\") pod \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191180 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-config-data\") pod \"3d716775-bb66-4346-94e1-20fb78c474d6\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191218 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191237 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-credential-keys\") pod \"3d716775-bb66-4346-94e1-20fb78c474d6\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191272 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-scripts\") pod \"3d716775-bb66-4346-94e1-20fb78c474d6\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191313 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xptfz\" (UniqueName: \"kubernetes.io/projected/e4c79093-2fb1-49d4-9fbd-abca828a44c9-kube-api-access-xptfz\") pod \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191342 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-combined-ca-bundle\") pod \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191358 
4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-scripts\") pod \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191387 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-fernet-keys\") pod \"3d716775-bb66-4346-94e1-20fb78c474d6\" (UID: \"3d716775-bb66-4346-94e1-20fb78c474d6\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.191403 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-logs\") pod \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\" (UID: \"e4c79093-2fb1-49d4-9fbd-abca828a44c9\") " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.192907 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e4c79093-2fb1-49d4-9fbd-abca828a44c9" (UID: "e4c79093-2fb1-49d4-9fbd-abca828a44c9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.193197 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-logs" (OuterVolumeSpecName: "logs") pod "e4c79093-2fb1-49d4-9fbd-abca828a44c9" (UID: "e4c79093-2fb1-49d4-9fbd-abca828a44c9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.206094 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3d716775-bb66-4346-94e1-20fb78c474d6" (UID: "3d716775-bb66-4346-94e1-20fb78c474d6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.220019 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-scripts" (OuterVolumeSpecName: "scripts") pod "3d716775-bb66-4346-94e1-20fb78c474d6" (UID: "3d716775-bb66-4346-94e1-20fb78c474d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.221055 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3d716775-bb66-4346-94e1-20fb78c474d6" (UID: "3d716775-bb66-4346-94e1-20fb78c474d6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.221573 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-scripts" (OuterVolumeSpecName: "scripts") pod "e4c79093-2fb1-49d4-9fbd-abca828a44c9" (UID: "e4c79093-2fb1-49d4-9fbd-abca828a44c9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.223872 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c79093-2fb1-49d4-9fbd-abca828a44c9-kube-api-access-xptfz" (OuterVolumeSpecName: "kube-api-access-xptfz") pod "e4c79093-2fb1-49d4-9fbd-abca828a44c9" (UID: "e4c79093-2fb1-49d4-9fbd-abca828a44c9"). InnerVolumeSpecName "kube-api-access-xptfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.234160 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d716775-bb66-4346-94e1-20fb78c474d6-kube-api-access-nv8hr" (OuterVolumeSpecName: "kube-api-access-nv8hr") pod "3d716775-bb66-4346-94e1-20fb78c474d6" (UID: "3d716775-bb66-4346-94e1-20fb78c474d6"). InnerVolumeSpecName "kube-api-access-nv8hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.296379 4867 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.296592 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.296666 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xptfz\" (UniqueName: \"kubernetes.io/projected/e4c79093-2fb1-49d4-9fbd-abca828a44c9-kube-api-access-xptfz\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.296773 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-scripts\") on 
node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.297100 4867 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.297194 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.297270 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv8hr\" (UniqueName: \"kubernetes.io/projected/3d716775-bb66-4346-94e1-20fb78c474d6-kube-api-access-nv8hr\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.297344 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4c79093-2fb1-49d4-9fbd-abca828a44c9-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.300440 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e4c79093-2fb1-49d4-9fbd-abca828a44c9" (UID: "e4c79093-2fb1-49d4-9fbd-abca828a44c9"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.305658 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4c79093-2fb1-49d4-9fbd-abca828a44c9" (UID: "e4c79093-2fb1-49d4-9fbd-abca828a44c9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.364438 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-config-data" (OuterVolumeSpecName: "config-data") pod "e4c79093-2fb1-49d4-9fbd-abca828a44c9" (UID: "e4c79093-2fb1-49d4-9fbd-abca828a44c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.374119 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d716775-bb66-4346-94e1-20fb78c474d6" (UID: "3d716775-bb66-4346-94e1-20fb78c474d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.395796 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-config-data" (OuterVolumeSpecName: "config-data") pod "3d716775-bb66-4346-94e1-20fb78c474d6" (UID: "3d716775-bb66-4346-94e1-20fb78c474d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.404725 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.404760 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.404775 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d716775-bb66-4346-94e1-20fb78c474d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.404801 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.404817 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.421678 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cgnvm" event={"ID":"3d716775-bb66-4346-94e1-20fb78c474d6","Type":"ContainerDied","Data":"6824e02f30714ce87b27892b54e1fe973f8e3131936fe7b67f16dea1c45c957f"} Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.421713 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6824e02f30714ce87b27892b54e1fe973f8e3131936fe7b67f16dea1c45c957f" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.421773 4867 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cgnvm" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.424242 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4c79093-2fb1-49d4-9fbd-abca828a44c9","Type":"ContainerDied","Data":"9ceb92834ae8ce485cea67823fb68d46ab65996438a0a4d0feb18e13e3a0e9b0"} Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.424286 4867 scope.go:117] "RemoveContainer" containerID="835fe82b6ff65e05d44667c0a9eaaff1cac0d1626bb6ee7f2f395995be0bec1a" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.424406 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.431209 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4c79093-2fb1-49d4-9fbd-abca828a44c9" (UID: "e4c79093-2fb1-49d4-9fbd-abca828a44c9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.454331 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.507159 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.507190 4867 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c79093-2fb1-49d4-9fbd-abca828a44c9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.761011 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.790446 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.808066 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:28:18 crc kubenswrapper[4867]: E1201 09:28:18.808888 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerName="glance-log" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.808912 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerName="glance-log" Dec 01 09:28:18 crc kubenswrapper[4867]: E1201 09:28:18.808935 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d716775-bb66-4346-94e1-20fb78c474d6" containerName="keystone-bootstrap" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.808942 4867 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3d716775-bb66-4346-94e1-20fb78c474d6" containerName="keystone-bootstrap" Dec 01 09:28:18 crc kubenswrapper[4867]: E1201 09:28:18.808966 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerName="glance-httpd" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.808978 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerName="glance-httpd" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.809332 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerName="glance-log" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.809363 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" containerName="glance-httpd" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.809380 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d716775-bb66-4346-94e1-20fb78c474d6" containerName="keystone-bootstrap" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.811078 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.832454 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.834539 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.867224 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c79093-2fb1-49d4-9fbd-abca828a44c9" path="/var/lib/kubelet/pods/e4c79093-2fb1-49d4-9fbd-abca828a44c9/volumes" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.867744 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.916079 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.916345 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.916434 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.916567 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.916653 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.916776 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.916925 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwc4\" (UniqueName: \"kubernetes.io/projected/1179a242-dbf2-4bc1-888b-33f22df356a6-kube-api-access-nxwc4\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:18 crc kubenswrapper[4867]: I1201 09:28:18.917101 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.018442 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.018543 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.018567 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.018608 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.018640 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc 
kubenswrapper[4867]: I1201 09:28:19.018684 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.018726 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwc4\" (UniqueName: \"kubernetes.io/projected/1179a242-dbf2-4bc1-888b-33f22df356a6-kube-api-access-nxwc4\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.018802 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.022807 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.023573 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc 
kubenswrapper[4867]: I1201 09:28:19.024684 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.028568 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.031325 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.037297 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.041356 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.044307 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nxwc4\" (UniqueName: \"kubernetes.io/projected/1179a242-dbf2-4bc1-888b-33f22df356a6-kube-api-access-nxwc4\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.056146 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.138990 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.247981 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cgnvm"] Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.254621 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cgnvm"] Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.361514 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dq766"] Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.363894 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.368227 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h9gmt" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.368401 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.368428 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.368401 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.368757 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.381128 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dq766"] Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.425040 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-credential-keys\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.425119 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-config-data\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.427278 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-combined-ca-bundle\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.427321 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-fernet-keys\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.427344 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-scripts\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.427361 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km4gn\" (UniqueName: \"kubernetes.io/projected/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-kube-api-access-km4gn\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.529104 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-credential-keys\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.529183 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-config-data\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.529279 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-combined-ca-bundle\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.529301 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-fernet-keys\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.529321 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-scripts\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.529348 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km4gn\" (UniqueName: \"kubernetes.io/projected/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-kube-api-access-km4gn\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.537550 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-credential-keys\") pod \"keystone-bootstrap-dq766\" 
(UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.538057 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-combined-ca-bundle\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.541423 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-fernet-keys\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.549557 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-config-data\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.550211 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-scripts\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 09:28:19.570423 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km4gn\" (UniqueName: \"kubernetes.io/projected/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-kube-api-access-km4gn\") pod \"keystone-bootstrap-dq766\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") " pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:19 crc kubenswrapper[4867]: I1201 
09:28:19.741952 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dq766" Dec 01 09:28:20 crc kubenswrapper[4867]: I1201 09:28:20.837069 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d716775-bb66-4346-94e1-20fb78c474d6" path="/var/lib/kubelet/pods/3d716775-bb66-4346-94e1-20fb78c474d6/volumes" Dec 01 09:28:21 crc kubenswrapper[4867]: I1201 09:28:21.107727 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: connect: connection refused" Dec 01 09:28:26 crc kubenswrapper[4867]: I1201 09:28:26.111843 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: connect: connection refused" Dec 01 09:28:26 crc kubenswrapper[4867]: I1201 09:28:26.112453 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:28:27 crc kubenswrapper[4867]: E1201 09:28:27.943314 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 01 09:28:27 crc kubenswrapper[4867]: E1201 09:28:27.943735 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58fh64ch5c6h5f7h5c9h576hb9h575h66fh58fh65bh577hc4h58h64dh676h695h6ch57dhc6h658h57dh5b4h66fh88h64bhc4h565hd4h54bhd6h8bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkqbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7599b45569-t2jm7_openstack(d8882ae2-488c-41aa-96a2-b94cca902a3a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:28:27 crc kubenswrapper[4867]: E1201 
09:28:27.946168 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7599b45569-t2jm7" podUID="d8882ae2-488c-41aa-96a2-b94cca902a3a" Dec 01 09:28:27 crc kubenswrapper[4867]: E1201 09:28:27.952909 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 01 09:28:27 crc kubenswrapper[4867]: E1201 09:28:27.953104 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n89h65bh5d7h8dh654hffh5hb5hb5h695h86h5bbh7dh586h5b9h76h698h65h644h87h646h556h7h5dfhbfh8dh57bh594h79h5b8h5c6h57bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wl97c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-674b4ff7c9-r6p2x_openstack(ded298f7-555e-4f07-9125-0c2f78158131): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:28:27 crc kubenswrapper[4867]: E1201 
09:28:27.965054 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-674b4ff7c9-r6p2x" podUID="ded298f7-555e-4f07-9125-0c2f78158131" Dec 01 09:28:27 crc kubenswrapper[4867]: E1201 09:28:27.980184 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 01 09:28:27 crc kubenswrapper[4867]: E1201 09:28:27.980317 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64h7fh5chd5h658h647h78h5f8hb4h5d7h68bh6bh556h54fh657h5b8h587h5fbhcbhfdh54fh76h64dh7ch576h6ch66ch87h58fh76h67h5f4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q2vzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-56fc8957fc-4449t_openstack(88d30955-4c30-4000-b6c7-ab3414b9bb32): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:28:27 crc kubenswrapper[4867]: E1201 
09:28:27.982288 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-56fc8957fc-4449t" podUID="88d30955-4c30-4000-b6c7-ab3414b9bb32" Dec 01 09:28:28 crc kubenswrapper[4867]: I1201 09:28:28.545038 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d47c7cb76-srf4p"] Dec 01 09:28:33 crc kubenswrapper[4867]: I1201 09:28:33.834223 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 09:28:33 crc kubenswrapper[4867]: I1201 09:28:33.834533 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 09:28:36 crc kubenswrapper[4867]: I1201 09:28:36.107975 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Dec 01 09:28:41 crc kubenswrapper[4867]: I1201 09:28:41.109492 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Dec 01 09:28:43 crc kubenswrapper[4867]: E1201 09:28:43.484706 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 01 09:28:43 crc kubenswrapper[4867]: E1201 09:28:43.485261 4867 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2w979,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-56qbp_openstack(a891c34b-01dc-4e65-ad1d-b21597555988): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:28:43 crc kubenswrapper[4867]: E1201 09:28:43.486702 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-56qbp" podUID="a891c34b-01dc-4e65-ad1d-b21597555988" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.568916 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.659868 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-httpd-run\") pod \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660171 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-logs\") pod \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660211 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-combined-ca-bundle\") pod \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660266 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660266 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-httpd-run" (OuterVolumeSpecName: 
"httpd-run") pod "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" (UID: "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660344 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-public-tls-certs\") pod \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660373 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-scripts\") pod \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660443 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvpvd\" (UniqueName: \"kubernetes.io/projected/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-kube-api-access-kvpvd\") pod \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660465 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-config-data\") pod \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\" (UID: \"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2\") " Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660514 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-logs" (OuterVolumeSpecName: "logs") pod "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" (UID: "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660880 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.660899 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.669362 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" (UID: "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.669384 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-scripts" (OuterVolumeSpecName: "scripts") pod "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" (UID: "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.679400 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-kube-api-access-kvpvd" (OuterVolumeSpecName: "kube-api-access-kvpvd") pod "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" (UID: "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2"). InnerVolumeSpecName "kube-api-access-kvpvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.691390 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" (UID: "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.712771 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c6efbdf-ccf5-4184-b7a1-8117f9133ed2","Type":"ContainerDied","Data":"c0cb33c6854945f3e87a92848a4f615cc19c7228393247a8650e63c707ed8654"} Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.712832 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 09:28:43 crc kubenswrapper[4867]: E1201 09:28:43.715745 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-56qbp" podUID="a891c34b-01dc-4e65-ad1d-b21597555988" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.716670 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-config-data" (OuterVolumeSpecName: "config-data") pod "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" (UID: "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.718952 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" (UID: "8c6efbdf-ccf5-4184-b7a1-8117f9133ed2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.762678 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvpvd\" (UniqueName: \"kubernetes.io/projected/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-kube-api-access-kvpvd\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.763035 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.763073 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.763103 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.763116 4867 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.763129 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.788602 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.865360 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:43 crc kubenswrapper[4867]: E1201 09:28:43.940792 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 01 09:28:43 crc kubenswrapper[4867]: E1201 09:28:43.940980 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n595h68ch584h676h547h6fh95h5bbh5b6h5dh695h597h6dh57h566h599hdbh59bh685h567h5fh97h56bh5c7h5d4h564h5dch55h599h544h7fh65fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwqhj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e0509290-ed5f-4982-bae7-8710f1eeb88f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:28:43 crc kubenswrapper[4867]: W1201 09:28:43.982962 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bd4fac2_df2c_4aab_bf00_99b54a83ddca.slice/crio-3a315891b1e46cf9b495747e2748a5df313c490eb138ca689f3e98182ef063ab WatchSource:0}: Error finding container 3a315891b1e46cf9b495747e2748a5df313c490eb138ca689f3e98182ef063ab: Status 404 returned error can't find the container with id 3a315891b1e46cf9b495747e2748a5df313c490eb138ca689f3e98182ef063ab Dec 01 09:28:43 crc kubenswrapper[4867]: I1201 09:28:43.987122 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-674b4ff7c9-r6p2x" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.022301 4867 scope.go:117] "RemoveContainer" containerID="f80f32ef76a1e5259200395738dece2f006f173adf48f9885430ab90a04d3e60" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.031975 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.035607 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56fc8957fc-4449t" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.046138 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7599b45569-t2jm7" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.069459 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ded298f7-555e-4f07-9125-0c2f78158131-horizon-secret-key\") pod \"ded298f7-555e-4f07-9125-0c2f78158131\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.069759 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/88d30955-4c30-4000-b6c7-ab3414b9bb32-horizon-secret-key\") pod \"88d30955-4c30-4000-b6c7-ab3414b9bb32\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.069802 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkqbx\" (UniqueName: \"kubernetes.io/projected/d8882ae2-488c-41aa-96a2-b94cca902a3a-kube-api-access-nkqbx\") pod \"d8882ae2-488c-41aa-96a2-b94cca902a3a\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.069878 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wl97c\" (UniqueName: \"kubernetes.io/projected/ded298f7-555e-4f07-9125-0c2f78158131-kube-api-access-wl97c\") pod \"ded298f7-555e-4f07-9125-0c2f78158131\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.069919 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-scripts\") pod \"ded298f7-555e-4f07-9125-0c2f78158131\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.069940 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded298f7-555e-4f07-9125-0c2f78158131-logs\") pod \"ded298f7-555e-4f07-9125-0c2f78158131\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.069974 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-config-data\") pod \"ded298f7-555e-4f07-9125-0c2f78158131\" (UID: \"ded298f7-555e-4f07-9125-0c2f78158131\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070001 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-config\") pod \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070016 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-svc\") pod \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " Dec 01 09:28:44 crc 
kubenswrapper[4867]: I1201 09:28:44.070041 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-swift-storage-0\") pod \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070056 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-scripts\") pod \"88d30955-4c30-4000-b6c7-ab3414b9bb32\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070089 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-nb\") pod \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070111 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d30955-4c30-4000-b6c7-ab3414b9bb32-logs\") pod \"88d30955-4c30-4000-b6c7-ab3414b9bb32\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070140 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-config-data\") pod \"88d30955-4c30-4000-b6c7-ab3414b9bb32\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070156 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-sb\") pod 
\"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070196 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2vzn\" (UniqueName: \"kubernetes.io/projected/88d30955-4c30-4000-b6c7-ab3414b9bb32-kube-api-access-q2vzn\") pod \"88d30955-4c30-4000-b6c7-ab3414b9bb32\" (UID: \"88d30955-4c30-4000-b6c7-ab3414b9bb32\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070211 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8882ae2-488c-41aa-96a2-b94cca902a3a-logs\") pod \"d8882ae2-488c-41aa-96a2-b94cca902a3a\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070228 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8882ae2-488c-41aa-96a2-b94cca902a3a-horizon-secret-key\") pod \"d8882ae2-488c-41aa-96a2-b94cca902a3a\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070279 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-scripts\") pod \"d8882ae2-488c-41aa-96a2-b94cca902a3a\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070298 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxtvm\" (UniqueName: \"kubernetes.io/projected/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-kube-api-access-nxtvm\") pod \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\" (UID: \"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070314 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-config-data\") pod \"d8882ae2-488c-41aa-96a2-b94cca902a3a\" (UID: \"d8882ae2-488c-41aa-96a2-b94cca902a3a\") " Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.070949 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-scripts" (OuterVolumeSpecName: "scripts") pod "88d30955-4c30-4000-b6c7-ab3414b9bb32" (UID: "88d30955-4c30-4000-b6c7-ab3414b9bb32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.071527 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded298f7-555e-4f07-9125-0c2f78158131-logs" (OuterVolumeSpecName: "logs") pod "ded298f7-555e-4f07-9125-0c2f78158131" (UID: "ded298f7-555e-4f07-9125-0c2f78158131"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.073766 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-config-data" (OuterVolumeSpecName: "config-data") pod "ded298f7-555e-4f07-9125-0c2f78158131" (UID: "ded298f7-555e-4f07-9125-0c2f78158131"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.074388 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8882ae2-488c-41aa-96a2-b94cca902a3a-logs" (OuterVolumeSpecName: "logs") pod "d8882ae2-488c-41aa-96a2-b94cca902a3a" (UID: "d8882ae2-488c-41aa-96a2-b94cca902a3a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.077801 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-scripts" (OuterVolumeSpecName: "scripts") pod "d8882ae2-488c-41aa-96a2-b94cca902a3a" (UID: "d8882ae2-488c-41aa-96a2-b94cca902a3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.080965 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-scripts" (OuterVolumeSpecName: "scripts") pod "ded298f7-555e-4f07-9125-0c2f78158131" (UID: "ded298f7-555e-4f07-9125-0c2f78158131"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.081555 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-config-data" (OuterVolumeSpecName: "config-data") pod "d8882ae2-488c-41aa-96a2-b94cca902a3a" (UID: "d8882ae2-488c-41aa-96a2-b94cca902a3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.081600 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-config-data" (OuterVolumeSpecName: "config-data") pod "88d30955-4c30-4000-b6c7-ab3414b9bb32" (UID: "88d30955-4c30-4000-b6c7-ab3414b9bb32"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.082085 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d30955-4c30-4000-b6c7-ab3414b9bb32-logs" (OuterVolumeSpecName: "logs") pod "88d30955-4c30-4000-b6c7-ab3414b9bb32" (UID: "88d30955-4c30-4000-b6c7-ab3414b9bb32"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.087325 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d30955-4c30-4000-b6c7-ab3414b9bb32-kube-api-access-q2vzn" (OuterVolumeSpecName: "kube-api-access-q2vzn") pod "88d30955-4c30-4000-b6c7-ab3414b9bb32" (UID: "88d30955-4c30-4000-b6c7-ab3414b9bb32"). InnerVolumeSpecName "kube-api-access-q2vzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.087414 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded298f7-555e-4f07-9125-0c2f78158131-kube-api-access-wl97c" (OuterVolumeSpecName: "kube-api-access-wl97c") pod "ded298f7-555e-4f07-9125-0c2f78158131" (UID: "ded298f7-555e-4f07-9125-0c2f78158131"). InnerVolumeSpecName "kube-api-access-wl97c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.087449 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d30955-4c30-4000-b6c7-ab3414b9bb32-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "88d30955-4c30-4000-b6c7-ab3414b9bb32" (UID: "88d30955-4c30-4000-b6c7-ab3414b9bb32"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.087487 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded298f7-555e-4f07-9125-0c2f78158131-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ded298f7-555e-4f07-9125-0c2f78158131" (UID: "ded298f7-555e-4f07-9125-0c2f78158131"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.093652 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8882ae2-488c-41aa-96a2-b94cca902a3a-kube-api-access-nkqbx" (OuterVolumeSpecName: "kube-api-access-nkqbx") pod "d8882ae2-488c-41aa-96a2-b94cca902a3a" (UID: "d8882ae2-488c-41aa-96a2-b94cca902a3a"). InnerVolumeSpecName "kube-api-access-nkqbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.121020 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8882ae2-488c-41aa-96a2-b94cca902a3a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d8882ae2-488c-41aa-96a2-b94cca902a3a" (UID: "d8882ae2-488c-41aa-96a2-b94cca902a3a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.121097 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-kube-api-access-nxtvm" (OuterVolumeSpecName: "kube-api-access-nxtvm") pod "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" (UID: "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a"). InnerVolumeSpecName "kube-api-access-nxtvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.143348 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" (UID: "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174085 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174112 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxtvm\" (UniqueName: \"kubernetes.io/projected/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-kube-api-access-nxtvm\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174130 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8882ae2-488c-41aa-96a2-b94cca902a3a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174141 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ded298f7-555e-4f07-9125-0c2f78158131-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174160 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/88d30955-4c30-4000-b6c7-ab3414b9bb32-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174168 4867 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-nkqbx\" (UniqueName: \"kubernetes.io/projected/d8882ae2-488c-41aa-96a2-b94cca902a3a-kube-api-access-nkqbx\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174176 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl97c\" (UniqueName: \"kubernetes.io/projected/ded298f7-555e-4f07-9125-0c2f78158131-kube-api-access-wl97c\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174185 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174192 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded298f7-555e-4f07-9125-0c2f78158131-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174200 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ded298f7-555e-4f07-9125-0c2f78158131-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174207 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174215 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174222 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d30955-4c30-4000-b6c7-ab3414b9bb32-logs\") on node \"crc\" DevicePath \"\"" Dec 01 
09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174231 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88d30955-4c30-4000-b6c7-ab3414b9bb32-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174240 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2vzn\" (UniqueName: \"kubernetes.io/projected/88d30955-4c30-4000-b6c7-ab3414b9bb32-kube-api-access-q2vzn\") on node \"crc\" DevicePath \"\""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174416 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8882ae2-488c-41aa-96a2-b94cca902a3a-logs\") on node \"crc\" DevicePath \"\""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.174432 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8882ae2-488c-41aa-96a2-b94cca902a3a-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.199574 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-config" (OuterVolumeSpecName: "config") pod "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" (UID: "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.229897 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" (UID: "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.229617 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" (UID: "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.238054 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" (UID: "604c66a1-4f1b-49a5-a4f1-cb4b2405a97a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.240838 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.247767 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.261938 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 01 09:28:44 crc kubenswrapper[4867]: E1201 09:28:44.262356 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="init"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.262374 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="init"
Dec 01 09:28:44 crc kubenswrapper[4867]: E1201 09:28:44.262388 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerName="glance-log"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.262395 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerName="glance-log"
Dec 01 09:28:44 crc kubenswrapper[4867]: E1201 09:28:44.262413 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.262419 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns"
Dec 01 09:28:44 crc kubenswrapper[4867]: E1201 09:28:44.262433 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerName="glance-httpd"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.262441 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerName="glance-httpd"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.262621 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.262641 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerName="glance-httpd"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.262657 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" containerName="glance-log"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.263578 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.271353 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.271873 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275441 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275559 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-config-data\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275577 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275597 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q5tf\" (UniqueName: \"kubernetes.io/projected/94647b4f-f18c-4010-8573-b36075f21ecc-kube-api-access-8q5tf\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275619 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275638 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-scripts\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275655 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275678 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-logs\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275726 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-config\") on node \"crc\" DevicePath \"\""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275736 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275745 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.275756 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.279443 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.378867 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-config-data\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.379344 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.379396 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q5tf\" (UniqueName: \"kubernetes.io/projected/94647b4f-f18c-4010-8573-b36075f21ecc-kube-api-access-8q5tf\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.379430 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.379450 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-scripts\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.379470 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.379504 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-logs\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.379545 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.380075 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.392350 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.392368 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-logs\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.429960 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-scripts\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.430837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.436710 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.455697 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.456049 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q5tf\" (UniqueName: \"kubernetes.io/projected/94647b4f-f18c-4010-8573-b36075f21ecc-kube-api-access-8q5tf\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.476453 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-config-data\") pod \"glance-default-external-api-0\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.583729 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c846795f4-k7mlj"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.586327 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.724506 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7599b45569-t2jm7" event={"ID":"d8882ae2-488c-41aa-96a2-b94cca902a3a","Type":"ContainerDied","Data":"bc7ac0b44fb97deb41cc686450e7e7607346bb8eddc2667fae2588d187015699"}
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.724538 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7599b45569-t2jm7"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.729306 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" event={"ID":"604c66a1-4f1b-49a5-a4f1-cb4b2405a97a","Type":"ContainerDied","Data":"82dcdb9e11cc79c809628959b5150232686fd41007be8714f8d83c56e8a5f1f7"}
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.729349 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.730978 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d47c7cb76-srf4p" event={"ID":"8bd4fac2-df2c-4aab-bf00-99b54a83ddca","Type":"ContainerStarted","Data":"3a315891b1e46cf9b495747e2748a5df313c490eb138ca689f3e98182ef063ab"}
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.732194 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-674b4ff7c9-r6p2x" event={"ID":"ded298f7-555e-4f07-9125-0c2f78158131","Type":"ContainerDied","Data":"a27f2bce5627a525ec19ca0caecdbecaeb604c16af3e8382a5d83d8fae5317a7"}
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.732219 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-674b4ff7c9-r6p2x"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.736272 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56fc8957fc-4449t" event={"ID":"88d30955-4c30-4000-b6c7-ab3414b9bb32","Type":"ContainerDied","Data":"4fe8b1d25112b2c4ebfc3188d41b3397ac3f9dbf1e5cf5223160037be0ee2940"}
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.736330 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56fc8957fc-4449t"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.776357 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-k9s48"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.782690 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-k9s48"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.845207 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" path="/var/lib/kubelet/pods/604c66a1-4f1b-49a5-a4f1-cb4b2405a97a/volumes"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.846717 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6efbdf-ccf5-4184-b7a1-8117f9133ed2" path="/var/lib/kubelet/pods/8c6efbdf-ccf5-4184-b7a1-8117f9133ed2/volumes"
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.852972 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56fc8957fc-4449t"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.853216 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-56fc8957fc-4449t"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.867676 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7599b45569-t2jm7"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.911532 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7599b45569-t2jm7"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.926193 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-674b4ff7c9-r6p2x"]
Dec 01 09:28:44 crc kubenswrapper[4867]: I1201 09:28:44.933049 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-674b4ff7c9-r6p2x"]
Dec 01 09:28:45 crc kubenswrapper[4867]: W1201 09:28:45.925043 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3ec81b7_2197_4dfb_8865_9414f0cdfc6e.slice/crio-d1ea359bdea40daddb2690ec637a7978a3695bd78dd5c8829fbfee213f06285c WatchSource:0}: Error finding container d1ea359bdea40daddb2690ec637a7978a3695bd78dd5c8829fbfee213f06285c: Status 404 returned error can't find the container with id d1ea359bdea40daddb2690ec637a7978a3695bd78dd5c8829fbfee213f06285c
Dec 01 09:28:45 crc kubenswrapper[4867]: E1201 09:28:45.959881 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Dec 01 09:28:45 crc kubenswrapper[4867]: E1201 09:28:45.960037 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n8jtw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xmtc6_openstack(65b95ca9-4891-4e69-a789-a21549f94247): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 01 09:28:45 crc kubenswrapper[4867]: E1201 09:28:45.961277 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xmtc6" podUID="65b95ca9-4891-4e69-a789-a21549f94247"
Dec 01 09:28:45 crc kubenswrapper[4867]: I1201 09:28:45.974997 4867 scope.go:117] "RemoveContainer" containerID="852c33f1692d9de3d928ef6486d77bbd86b85717f3b8d1ea44ad0419e08963a3"
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.112162 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-k9s48" podUID="604c66a1-4f1b-49a5-a4f1-cb4b2405a97a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout"
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.122276 4867 scope.go:117] "RemoveContainer" containerID="c7a54dad3df8a3b7ee7088d0da50f65eea913be27ca846d01c8b7e688dd22649"
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.194725 4867 scope.go:117] "RemoveContainer" containerID="9adfed2bde346b21fa4adfc36b9773c70edefc16f3d71c0ef87af1899b53d9d2"
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.231269 4867 scope.go:117] "RemoveContainer" containerID="465819b356223a4bd0035c262d41cfb3ae8635231e1dde776d2e5aabb5239c02"
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.473335 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dq766"]
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.658294 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.753870 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerStarted","Data":"d1ea359bdea40daddb2690ec637a7978a3695bd78dd5c8829fbfee213f06285c"}
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.756350 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-phzxd" event={"ID":"fb0c725b-663d-4764-b156-9426923ce046","Type":"ContainerStarted","Data":"b3aafaeb547edff8fcedb18f00fdc75d9dcd19277509071c41c71c261000a533"}
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.763535 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d47c7cb76-srf4p" event={"ID":"8bd4fac2-df2c-4aab-bf00-99b54a83ddca","Type":"ContainerStarted","Data":"9d03af7b1362790fa6ac6592121987809cbd79e15e394cba2fd458a1d0946120"}
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.763581 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d47c7cb76-srf4p" event={"ID":"8bd4fac2-df2c-4aab-bf00-99b54a83ddca","Type":"ContainerStarted","Data":"7902fe315a7ed47f08cd785d91c6824fb96607b18782da76479c24f02dfe655d"}
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.789169 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-phzxd" podStartSLOduration=5.576481911 podStartE2EDuration="44.789149021s" podCreationTimestamp="2025-12-01 09:28:02 +0000 UTC" firstStartedPulling="2025-12-01 09:28:04.809359226 +0000 UTC m=+1206.268745970" lastFinishedPulling="2025-12-01 09:28:44.022026326 +0000 UTC m=+1245.481413080" observedRunningTime="2025-12-01 09:28:46.78251142 +0000 UTC m=+1248.241898174" watchObservedRunningTime="2025-12-01 09:28:46.789149021 +0000 UTC m=+1248.248535775"
Dec 01 09:28:46 crc kubenswrapper[4867]: E1201 09:28:46.802446 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-xmtc6" podUID="65b95ca9-4891-4e69-a789-a21549f94247"
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.803283 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-d47c7cb76-srf4p" podStartSLOduration=32.732525024 podStartE2EDuration="34.803260807s" podCreationTimestamp="2025-12-01 09:28:12 +0000 UTC" firstStartedPulling="2025-12-01 09:28:44.022436467 +0000 UTC m=+1245.481823221" lastFinishedPulling="2025-12-01 09:28:46.09317225 +0000 UTC m=+1247.552559004" observedRunningTime="2025-12-01 09:28:46.799399942 +0000 UTC m=+1248.258786706" watchObservedRunningTime="2025-12-01 09:28:46.803260807 +0000 UTC m=+1248.262647561"
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.840911 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88d30955-4c30-4000-b6c7-ab3414b9bb32" path="/var/lib/kubelet/pods/88d30955-4c30-4000-b6c7-ab3414b9bb32/volumes"
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.841441 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8882ae2-488c-41aa-96a2-b94cca902a3a" path="/var/lib/kubelet/pods/d8882ae2-488c-41aa-96a2-b94cca902a3a/volumes"
Dec 01 09:28:46 crc kubenswrapper[4867]: I1201 09:28:46.841989 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded298f7-555e-4f07-9125-0c2f78158131" path="/var/lib/kubelet/pods/ded298f7-555e-4f07-9125-0c2f78158131/volumes"
Dec 01 09:28:46 crc kubenswrapper[4867]: W1201 09:28:46.849967 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1179a242_dbf2_4bc1_888b_33f22df356a6.slice/crio-6c5abe63eb3b8ad76da57393c4e4eaade98c513750a7bc5630ca5bbcfd323d8f WatchSource:0}: Error finding container 6c5abe63eb3b8ad76da57393c4e4eaade98c513750a7bc5630ca5bbcfd323d8f: Status 404 returned error can't find the container with id 6c5abe63eb3b8ad76da57393c4e4eaade98c513750a7bc5630ca5bbcfd323d8f
Dec 01 09:28:46 crc kubenswrapper[4867]: W1201 09:28:46.851663 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdcd8107_dd0c_494b_b6ee_93fc8f3d6933.slice/crio-2689b4f51f3570b22e3ba54bb1c31fe044bd20014b13226576740c54be47ecbc WatchSource:0}: Error finding container 2689b4f51f3570b22e3ba54bb1c31fe044bd20014b13226576740c54be47ecbc: Status 404 returned error can't find the container with id 2689b4f51f3570b22e3ba54bb1c31fe044bd20014b13226576740c54be47ecbc
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.398415 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 01 09:28:47 crc kubenswrapper[4867]: W1201 09:28:47.400284 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94647b4f_f18c_4010_8573_b36075f21ecc.slice/crio-7ae1dee8772bc91bb3849fd8f65efd07d7b99781063fc5fe55ca9ed2a7cdee89 WatchSource:0}: Error finding container 7ae1dee8772bc91bb3849fd8f65efd07d7b99781063fc5fe55ca9ed2a7cdee89: Status 404 returned error can't find the container with id 7ae1dee8772bc91bb3849fd8f65efd07d7b99781063fc5fe55ca9ed2a7cdee89
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.831170 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1179a242-dbf2-4bc1-888b-33f22df356a6","Type":"ContainerStarted","Data":"be9c99f1cc0cad5bda870353e0af813dd03a1eacd33fa1fe8d450a2d291b1798"}
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.831207 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1179a242-dbf2-4bc1-888b-33f22df356a6","Type":"ContainerStarted","Data":"6c5abe63eb3b8ad76da57393c4e4eaade98c513750a7bc5630ca5bbcfd323d8f"}
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.834083 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerStarted","Data":"686c5303f0412b7b582b8c491b3e8223fe86fdd2e4836a2991c0f50fae8a3067"}
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.834111 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerStarted","Data":"7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f"}
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.839178 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dq766" event={"ID":"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933","Type":"ContainerStarted","Data":"3dde5d99cc2da2d38dca5a3796b9183ab71d8574bd8d931f6f284fcbd5788fac"}
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.839208 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dq766" event={"ID":"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933","Type":"ContainerStarted","Data":"2689b4f51f3570b22e3ba54bb1c31fe044bd20014b13226576740c54be47ecbc"}
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.843009 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0509290-ed5f-4982-bae7-8710f1eeb88f","Type":"ContainerStarted","Data":"5594fca1ae40081291997daa714acc24d25714466ae4978c2e34205fd9c3d19e"}
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.844900 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94647b4f-f18c-4010-8573-b36075f21ecc","Type":"ContainerStarted","Data":"7ae1dee8772bc91bb3849fd8f65efd07d7b99781063fc5fe55ca9ed2a7cdee89"}
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.859669 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-c846795f4-k7mlj" podStartSLOduration=34.811907463 podStartE2EDuration="35.859652854s" podCreationTimestamp="2025-12-01 09:28:12 +0000 UTC" firstStartedPulling="2025-12-01 09:28:45.932793419 +0000 UTC m=+1247.392180173" lastFinishedPulling="2025-12-01 09:28:46.98053881 +0000 UTC m=+1248.439925564" observedRunningTime="2025-12-01 09:28:47.854139133 +0000 UTC m=+1249.313525887" watchObservedRunningTime="2025-12-01 09:28:47.859652854 +0000 UTC m=+1249.319039608"
Dec 01 09:28:47 crc kubenswrapper[4867]: I1201 09:28:47.874698 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dq766" podStartSLOduration=28.874681196 podStartE2EDuration="28.874681196s" podCreationTimestamp="2025-12-01 09:28:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:47.867586862 +0000 UTC m=+1249.326973616" watchObservedRunningTime="2025-12-01 09:28:47.874681196 +0000 UTC m=+1249.334067950"
Dec 01 09:28:48 crc kubenswrapper[4867]: I1201 09:28:48.859215 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1179a242-dbf2-4bc1-888b-33f22df356a6","Type":"ContainerStarted","Data":"86061cced9fc07099d9d68d5d3403011f3b13c3c5b735f6f29f779c23f098583"}
Dec 01 09:28:48 crc kubenswrapper[4867]: I1201 09:28:48.868041 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94647b4f-f18c-4010-8573-b36075f21ecc","Type":"ContainerStarted","Data":"e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48"}
Dec 01 09:28:48 crc kubenswrapper[4867]: I1201 09:28:48.868082 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94647b4f-f18c-4010-8573-b36075f21ecc","Type":"ContainerStarted","Data":"c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7"}
Dec 01 09:28:48 crc kubenswrapper[4867]: I1201 09:28:48.871309 4867 generic.go:334] "Generic (PLEG): container finished" podID="b0894df6-2174-4d0a-9e26-93650fd0e925" containerID="738e1ba32f0935e93690fe3ae7b01f58519ac3f31f26a9814957c63353df460f" exitCode=0
Dec 01 09:28:48 crc kubenswrapper[4867]: I1201 09:28:48.874382 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s52lq" event={"ID":"b0894df6-2174-4d0a-9e26-93650fd0e925","Type":"ContainerDied","Data":"738e1ba32f0935e93690fe3ae7b01f58519ac3f31f26a9814957c63353df460f"}
Dec 01 09:28:48 crc kubenswrapper[4867]: I1201 09:28:48.953102 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=30.953084766 podStartE2EDuration="30.953084766s" podCreationTimestamp="2025-12-01 09:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:48.950838494 +0000 UTC m=+1250.410225258" watchObservedRunningTime="2025-12-01 09:28:48.953084766 +0000 UTC m=+1250.412471530"
Dec 01 09:28:48 crc kubenswrapper[4867]: I1201 09:28:48.987984 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.98796233 podStartE2EDuration="4.98796233s" podCreationTimestamp="2025-12-01 09:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:48.979348274 +0000 UTC m=+1250.438735048" watchObservedRunningTime="2025-12-01 09:28:48.98796233 +0000 UTC m=+1250.447349084"
Dec 01 09:28:49 crc kubenswrapper[4867]: I1201 09:28:49.139421 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 01 09:28:49 crc kubenswrapper[4867]: I1201 09:28:49.139937 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 01 09:28:49 crc kubenswrapper[4867]: I1201 09:28:49.139952 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 01 09:28:49 crc kubenswrapper[4867]: I1201 09:28:49.139964 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 01 09:28:49 crc kubenswrapper[4867]: I1201 09:28:49.191383 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 01 09:28:49 crc kubenswrapper[4867]: I1201 09:28:49.308656 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 01 09:28:49 crc kubenswrapper[4867]: I1201 09:28:49.881515 4867 generic.go:334] "Generic (PLEG): container finished" podID="fb0c725b-663d-4764-b156-9426923ce046" containerID="b3aafaeb547edff8fcedb18f00fdc75d9dcd19277509071c41c71c261000a533" exitCode=0
Dec 01 09:28:49 crc kubenswrapper[4867]: I1201 09:28:49.881563 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-phzxd" event={"ID":"fb0c725b-663d-4764-b156-9426923ce046","Type":"ContainerDied","Data":"b3aafaeb547edff8fcedb18f00fdc75d9dcd19277509071c41c71c261000a533"}
Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.287362 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.416591 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-combined-ca-bundle\") pod \"b0894df6-2174-4d0a-9e26-93650fd0e925\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.416689 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-config\") pod \"b0894df6-2174-4d0a-9e26-93650fd0e925\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.416720 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd29t\" (UniqueName: \"kubernetes.io/projected/b0894df6-2174-4d0a-9e26-93650fd0e925-kube-api-access-rd29t\") pod \"b0894df6-2174-4d0a-9e26-93650fd0e925\" (UID: \"b0894df6-2174-4d0a-9e26-93650fd0e925\") " Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.440571 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0894df6-2174-4d0a-9e26-93650fd0e925-kube-api-access-rd29t" (OuterVolumeSpecName: "kube-api-access-rd29t") pod "b0894df6-2174-4d0a-9e26-93650fd0e925" (UID: "b0894df6-2174-4d0a-9e26-93650fd0e925"). InnerVolumeSpecName "kube-api-access-rd29t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.443974 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0894df6-2174-4d0a-9e26-93650fd0e925" (UID: "b0894df6-2174-4d0a-9e26-93650fd0e925"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.448913 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-config" (OuterVolumeSpecName: "config") pod "b0894df6-2174-4d0a-9e26-93650fd0e925" (UID: "b0894df6-2174-4d0a-9e26-93650fd0e925"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.518701 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.518736 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0894df6-2174-4d0a-9e26-93650fd0e925-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.518749 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd29t\" (UniqueName: \"kubernetes.io/projected/b0894df6-2174-4d0a-9e26-93650fd0e925-kube-api-access-rd29t\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.894352 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s52lq" event={"ID":"b0894df6-2174-4d0a-9e26-93650fd0e925","Type":"ContainerDied","Data":"927e7126f3920e5505f19debebdfa8f66b1cea57443989f2bf436c3c3dd6fb0b"} Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.894395 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="927e7126f3920e5505f19debebdfa8f66b1cea57443989f2bf436c3c3dd6fb0b" Dec 01 09:28:50 crc kubenswrapper[4867]: I1201 09:28:50.894404 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-s52lq" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.262898 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-s7kjh"] Dec 01 09:28:51 crc kubenswrapper[4867]: E1201 09:28:51.263512 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0894df6-2174-4d0a-9e26-93650fd0e925" containerName="neutron-db-sync" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.263530 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0894df6-2174-4d0a-9e26-93650fd0e925" containerName="neutron-db-sync" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.263682 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0894df6-2174-4d0a-9e26-93650fd0e925" containerName="neutron-db-sync" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.264541 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.332747 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.332834 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-config\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.333095 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.333207 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-svc\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.333299 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzrl\" (UniqueName: \"kubernetes.io/projected/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-kube-api-access-6mzrl\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.333355 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.371327 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-s7kjh"] Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.434938 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-config\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 
09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.435077 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.435132 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-svc\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.435177 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzrl\" (UniqueName: \"kubernetes.io/projected/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-kube-api-access-6mzrl\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.435221 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.435273 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 
09:28:51.436976 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-config\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.437772 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-svc\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.438363 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.439197 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.441557 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.498655 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzrl\" 
(UniqueName: \"kubernetes.io/projected/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-kube-api-access-6mzrl\") pod \"dnsmasq-dns-55f844cf75-s7kjh\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") " pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.622290 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.716701 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7df757454-fcwhf"] Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.718298 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.721325 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.721585 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.721797 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4smxn" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.722032 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.744435 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-ovndb-tls-certs\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.744472 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-config\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.744495 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mctk\" (UniqueName: \"kubernetes.io/projected/35ebb9d5-af37-425c-b29b-c4f98eab213a-kube-api-access-5mctk\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.744590 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-httpd-config\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.744611 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-combined-ca-bundle\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.848138 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-config\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.848303 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mctk\" (UniqueName: 
\"kubernetes.io/projected/35ebb9d5-af37-425c-b29b-c4f98eab213a-kube-api-access-5mctk\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.848392 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-httpd-config\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.848406 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-combined-ca-bundle\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.848451 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-ovndb-tls-certs\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.854726 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-config\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.855269 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-httpd-config\") pod \"neutron-7df757454-fcwhf\" (UID: 
\"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.856235 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-ovndb-tls-certs\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.856563 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-combined-ca-bundle\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.910454 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7df757454-fcwhf"] Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.933604 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mctk\" (UniqueName: \"kubernetes.io/projected/35ebb9d5-af37-425c-b29b-c4f98eab213a-kube-api-access-5mctk\") pod \"neutron-7df757454-fcwhf\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.965336 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-phzxd" Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.967321 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-phzxd" event={"ID":"fb0c725b-663d-4764-b156-9426923ce046","Type":"ContainerDied","Data":"601a7deec5b6de487d05d80edeea804fe628d8989aa388578fded2fb02f45cdd"} Dec 01 09:28:51 crc kubenswrapper[4867]: I1201 09:28:51.967349 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="601a7deec5b6de487d05d80edeea804fe628d8989aa388578fded2fb02f45cdd" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.051636 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb0c725b-663d-4764-b156-9426923ce046-logs\") pod \"fb0c725b-663d-4764-b156-9426923ce046\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.051681 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-scripts\") pod \"fb0c725b-663d-4764-b156-9426923ce046\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.051741 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-config-data\") pod \"fb0c725b-663d-4764-b156-9426923ce046\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.051764 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-combined-ca-bundle\") pod \"fb0c725b-663d-4764-b156-9426923ce046\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " Dec 01 09:28:52 crc 
kubenswrapper[4867]: I1201 09:28:52.051947 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxhnb\" (UniqueName: \"kubernetes.io/projected/fb0c725b-663d-4764-b156-9426923ce046-kube-api-access-vxhnb\") pod \"fb0c725b-663d-4764-b156-9426923ce046\" (UID: \"fb0c725b-663d-4764-b156-9426923ce046\") " Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.054111 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb0c725b-663d-4764-b156-9426923ce046-logs" (OuterVolumeSpecName: "logs") pod "fb0c725b-663d-4764-b156-9426923ce046" (UID: "fb0c725b-663d-4764-b156-9426923ce046"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.059328 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-scripts" (OuterVolumeSpecName: "scripts") pod "fb0c725b-663d-4764-b156-9426923ce046" (UID: "fb0c725b-663d-4764-b156-9426923ce046"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.059858 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0c725b-663d-4764-b156-9426923ce046-kube-api-access-vxhnb" (OuterVolumeSpecName: "kube-api-access-vxhnb") pod "fb0c725b-663d-4764-b156-9426923ce046" (UID: "fb0c725b-663d-4764-b156-9426923ce046"). InnerVolumeSpecName "kube-api-access-vxhnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.099953 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb0c725b-663d-4764-b156-9426923ce046" (UID: "fb0c725b-663d-4764-b156-9426923ce046"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.104608 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-config-data" (OuterVolumeSpecName: "config-data") pod "fb0c725b-663d-4764-b156-9426923ce046" (UID: "fb0c725b-663d-4764-b156-9426923ce046"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.134100 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.179946 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxhnb\" (UniqueName: \"kubernetes.io/projected/fb0c725b-663d-4764-b156-9426923ce046-kube-api-access-vxhnb\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.179978 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb0c725b-663d-4764-b156-9426923ce046-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.179989 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.179998 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.180012 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c725b-663d-4764-b156-9426923ce046-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\""
Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.393494 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-s7kjh"]
Dec 01 09:28:52 crc kubenswrapper[4867]: W1201 09:28:52.400369 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e8123ef_03c8_4e49_b631_d4d90f54c8d0.slice/crio-86768a8b72610851b6e26224864546d5e4b377e983ed4685cbea74a8648db0b1 WatchSource:0}: Error finding container 86768a8b72610851b6e26224864546d5e4b377e983ed4685cbea74a8648db0b1: Status 404 returned error can't find the container with id 86768a8b72610851b6e26224864546d5e4b377e983ed4685cbea74a8648db0b1
Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.527988 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7df757454-fcwhf"]
Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.906957 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-c846795f4-k7mlj"
Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.907953 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c846795f4-k7mlj"
Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.985660 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" event={"ID":"3e8123ef-03c8-4e49-b631-d4d90f54c8d0","Type":"ContainerStarted","Data":"86768a8b72610851b6e26224864546d5e4b377e983ed4685cbea74a8648db0b1"}
Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.989467 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7df757454-fcwhf" event={"ID":"35ebb9d5-af37-425c-b29b-c4f98eab213a","Type":"ContainerStarted","Data":"5c18094bccbcd00687eb027ec9281e601bb186bcb0eaf33db45550a960607c01"}
Dec 01 09:28:52 crc kubenswrapper[4867]: I1201 09:28:52.989515 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-phzxd"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.146870 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-68bfcdf768-4dtj7"]
Dec 01 09:28:53 crc kubenswrapper[4867]: E1201 09:28:53.147449 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb0c725b-663d-4764-b156-9426923ce046" containerName="placement-db-sync"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.147468 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0c725b-663d-4764-b156-9426923ce046" containerName="placement-db-sync"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.147644 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb0c725b-663d-4764-b156-9426923ce046" containerName="placement-db-sync"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.148767 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.152013 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.152296 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.154854 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.154863 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.155006 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8pv9s"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.168882 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68bfcdf768-4dtj7"]
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.204441 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-public-tls-certs\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.204593 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-combined-ca-bundle\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.204942 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2v2z\" (UniqueName: \"kubernetes.io/projected/a57f081c-e4b7-4dbb-a817-4d36052f3145-kube-api-access-d2v2z\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.204987 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-scripts\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.205269 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a57f081c-e4b7-4dbb-a817-4d36052f3145-logs\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.205296 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-config-data\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.205314 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-internal-tls-certs\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.296179 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-d47c7cb76-srf4p"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.296233 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-d47c7cb76-srf4p"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.307284 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-config-data\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.307326 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-internal-tls-certs\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.307374 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-combined-ca-bundle\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.307387 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-public-tls-certs\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.307450 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2v2z\" (UniqueName: \"kubernetes.io/projected/a57f081c-e4b7-4dbb-a817-4d36052f3145-kube-api-access-d2v2z\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.307489 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-scripts\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.307521 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a57f081c-e4b7-4dbb-a817-4d36052f3145-logs\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.307965 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a57f081c-e4b7-4dbb-a817-4d36052f3145-logs\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.326228 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-internal-tls-certs\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.327180 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-config-data\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.327226 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-public-tls-certs\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.329591 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-scripts\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.331401 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57f081c-e4b7-4dbb-a817-4d36052f3145-combined-ca-bundle\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.340192 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2v2z\" (UniqueName: \"kubernetes.io/projected/a57f081c-e4b7-4dbb-a817-4d36052f3145-kube-api-access-d2v2z\") pod \"placement-68bfcdf768-4dtj7\" (UID: \"a57f081c-e4b7-4dbb-a817-4d36052f3145\") " pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:53 crc kubenswrapper[4867]: I1201 09:28:53.468493 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68bfcdf768-4dtj7"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.009697 4867 generic.go:334] "Generic (PLEG): container finished" podID="3e8123ef-03c8-4e49-b631-d4d90f54c8d0" containerID="337f0022c97a7421961b20b0f515b48fb2fc7f68bcfdc9bbbccf360285a5322f" exitCode=0
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.010624 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" event={"ID":"3e8123ef-03c8-4e49-b631-d4d90f54c8d0","Type":"ContainerDied","Data":"337f0022c97a7421961b20b0f515b48fb2fc7f68bcfdc9bbbccf360285a5322f"}
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.283870 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-59b9c878df-5k6nq"]
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.285647 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.290871 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.300266 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.335930 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59b9c878df-5k6nq"]
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.439514 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-public-tls-certs\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.439575 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-ovndb-tls-certs\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.439592 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-config\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.439616 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-internal-tls-certs\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.439666 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-combined-ca-bundle\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.439686 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-httpd-config\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.439715 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpmr6\" (UniqueName: \"kubernetes.io/projected/7dea6dbd-f761-4336-b755-0a2c82f6c66b-kube-api-access-kpmr6\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.540907 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-public-tls-certs\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.540990 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-ovndb-tls-certs\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.541016 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-config\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.541047 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-internal-tls-certs\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.541095 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-combined-ca-bundle\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.541119 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-httpd-config\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.541162 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpmr6\" (UniqueName: \"kubernetes.io/projected/7dea6dbd-f761-4336-b755-0a2c82f6c66b-kube-api-access-kpmr6\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.550598 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-internal-tls-certs\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.551375 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-ovndb-tls-certs\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.552634 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-combined-ca-bundle\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.554649 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-config\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.561797 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-httpd-config\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.565568 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpmr6\" (UniqueName: \"kubernetes.io/projected/7dea6dbd-f761-4336-b755-0a2c82f6c66b-kube-api-access-kpmr6\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.574052 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dea6dbd-f761-4336-b755-0a2c82f6c66b-public-tls-certs\") pod \"neutron-59b9c878df-5k6nq\" (UID: \"7dea6dbd-f761-4336-b755-0a2c82f6c66b\") " pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.589459 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.589683 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.617561 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59b9c878df-5k6nq"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.678136 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 01 09:28:54 crc kubenswrapper[4867]: I1201 09:28:54.726312 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 01 09:28:55 crc kubenswrapper[4867]: I1201 09:28:55.025524 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 01 09:28:55 crc kubenswrapper[4867]: I1201 09:28:55.025781 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 01 09:28:57 crc kubenswrapper[4867]: I1201 09:28:57.054838 4867 generic.go:334] "Generic (PLEG): container finished" podID="bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" containerID="3dde5d99cc2da2d38dca5a3796b9183ab71d8574bd8d931f6f284fcbd5788fac" exitCode=0
Dec 01 09:28:57 crc kubenswrapper[4867]: I1201 09:28:57.055995 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 01 09:28:57 crc kubenswrapper[4867]: I1201 09:28:57.056088 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 01 09:28:57 crc kubenswrapper[4867]: I1201 09:28:57.055151 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dq766" event={"ID":"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933","Type":"ContainerDied","Data":"3dde5d99cc2da2d38dca5a3796b9183ab71d8574bd8d931f6f284fcbd5788fac"}
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.457594 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.458474 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.532399 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.633110 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dq766"
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.807217 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-combined-ca-bundle\") pod \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") "
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.807787 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-credential-keys\") pod \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") "
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.807840 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-config-data\") pod \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") "
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.807868 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-scripts\") pod \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") "
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.807992 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-fernet-keys\") pod \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") "
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.808144 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km4gn\" (UniqueName: \"kubernetes.io/projected/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-kube-api-access-km4gn\") pod \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\" (UID: \"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933\") "
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.828301 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-kube-api-access-km4gn" (OuterVolumeSpecName: "kube-api-access-km4gn") pod "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" (UID: "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933"). InnerVolumeSpecName "kube-api-access-km4gn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.829382 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-scripts" (OuterVolumeSpecName: "scripts") pod "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" (UID: "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.829399 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" (UID: "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.833295 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" (UID: "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.894847 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-config-data" (OuterVolumeSpecName: "config-data") pod "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" (UID: "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.899574 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" (UID: "bdcd8107-dd0c-494b-b6ee-93fc8f3d6933"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.908291 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused"
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.910537 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km4gn\" (UniqueName: \"kubernetes.io/projected/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-kube-api-access-km4gn\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.910580 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.910591 4867 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.910602 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.910615 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:02 crc kubenswrapper[4867]: I1201 09:29:02.910627 4867 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.121238 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dq766" event={"ID":"bdcd8107-dd0c-494b-b6ee-93fc8f3d6933","Type":"ContainerDied","Data":"2689b4f51f3570b22e3ba54bb1c31fe044bd20014b13226576740c54be47ecbc"}
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.121656 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2689b4f51f3570b22e3ba54bb1c31fe044bd20014b13226576740c54be47ecbc"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.121776 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dq766"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.124672 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7df757454-fcwhf" event={"ID":"35ebb9d5-af37-425c-b29b-c4f98eab213a","Type":"ContainerStarted","Data":"2da9838456e320ab2d731917161fedca6a8b98ca8c9cb79901fd72110515d5bf"}
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.294552 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.328764 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68bfcdf768-4dtj7"]
Dec 01 09:29:03 crc kubenswrapper[4867]: W1201 09:29:03.337555 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda57f081c_e4b7_4dbb_a817_4d36052f3145.slice/crio-b85b7ed3d1d10ebc1eb112f06cbfc598f84e204eff22f761c909784d9a5504b3 WatchSource:0}: Error finding container b85b7ed3d1d10ebc1eb112f06cbfc598f84e204eff22f761c909784d9a5504b3: Status 404 returned error can't find the container with id b85b7ed3d1d10ebc1eb112f06cbfc598f84e204eff22f761c909784d9a5504b3
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.770636 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7b56ffdd7f-kp95s"]
Dec 01 09:29:03 crc kubenswrapper[4867]: E1201 09:29:03.771305 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" containerName="keystone-bootstrap"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.771317 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" containerName="keystone-bootstrap"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.771515 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" containerName="keystone-bootstrap"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.772205 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.777700 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.777957 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.778026 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.778356 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.778496 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h9gmt"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.778498 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.802014 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b56ffdd7f-kp95s"]
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.946581 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-credential-keys\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.946621 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-config-data\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.946675 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-fernet-keys\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.946700 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-combined-ca-bundle\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.946795 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-internal-tls-certs\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.946890 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgprg\" (UniqueName: \"kubernetes.io/projected/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-kube-api-access-lgprg\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.946954 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-scripts\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:03 crc kubenswrapper[4867]: I1201 09:29:03.946978 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-public-tls-certs\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.028509 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59b9c878df-5k6nq"]
Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.049487 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-public-tls-certs\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.049565 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-credential-keys\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.049588 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-config-data\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.049627 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName:
\"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-fernet-keys\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.049649 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-combined-ca-bundle\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.049684 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-internal-tls-certs\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.049700 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgprg\" (UniqueName: \"kubernetes.io/projected/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-kube-api-access-lgprg\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.049758 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-scripts\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.058700 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-scripts\") pod 
\"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.062169 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-config-data\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.062578 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-public-tls-certs\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.063685 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-internal-tls-certs\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.064268 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-credential-keys\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.064333 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-combined-ca-bundle\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" 
Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.069676 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-fernet-keys\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.079207 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgprg\" (UniqueName: \"kubernetes.io/projected/fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0-kube-api-access-lgprg\") pod \"keystone-7b56ffdd7f-kp95s\" (UID: \"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0\") " pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.124330 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.190627 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7df757454-fcwhf" event={"ID":"35ebb9d5-af37-425c-b29b-c4f98eab213a","Type":"ContainerStarted","Data":"9bce89257fead3a13690ba81e640c75d7fce7e49bb50a9dd1dbc7cc2e4c7b7eb"} Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.191316 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.192599 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59b9c878df-5k6nq" event={"ID":"7dea6dbd-f761-4336-b755-0a2c82f6c66b","Type":"ContainerStarted","Data":"0cc845029d582a4ff1b699be80077725d02348cc79028de65a1c8252710acc14"} Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.207771 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" 
event={"ID":"3e8123ef-03c8-4e49-b631-d4d90f54c8d0","Type":"ContainerStarted","Data":"972971977dedf38a39bb58786ff5bf497a0a17d55c293f2e13921de6020e1309"} Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.208759 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.214697 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7df757454-fcwhf" podStartSLOduration=13.214684177 podStartE2EDuration="13.214684177s" podCreationTimestamp="2025-12-01 09:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:04.213138795 +0000 UTC m=+1265.672525559" watchObservedRunningTime="2025-12-01 09:29:04.214684177 +0000 UTC m=+1265.674070931" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.232146 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0509290-ed5f-4982-bae7-8710f1eeb88f","Type":"ContainerStarted","Data":"0964fd9990b16216ca47b3934f9501f15aa167e243b4ebd3de35d37e248c8151"} Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.233799 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-56qbp" event={"ID":"a891c34b-01dc-4e65-ad1d-b21597555988","Type":"ContainerStarted","Data":"e4f0fd8d89b9a3460463a367a8cb01bf7cfe3e0464b1931cd24446d8df94d90b"} Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.242141 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" podStartSLOduration=13.242120808 podStartE2EDuration="13.242120808s" podCreationTimestamp="2025-12-01 09:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:04.239719632 +0000 UTC m=+1265.699106386" 
watchObservedRunningTime="2025-12-01 09:29:04.242120808 +0000 UTC m=+1265.701507562" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.250215 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68bfcdf768-4dtj7" event={"ID":"a57f081c-e4b7-4dbb-a817-4d36052f3145","Type":"ContainerStarted","Data":"d7fa7874c9f2489f63dad90b9a61081cb52931a50ff1fbcef4249a79debc8ee9"} Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.250292 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68bfcdf768-4dtj7" event={"ID":"a57f081c-e4b7-4dbb-a817-4d36052f3145","Type":"ContainerStarted","Data":"b85b7ed3d1d10ebc1eb112f06cbfc598f84e204eff22f761c909784d9a5504b3"} Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.268435 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-56qbp" podStartSLOduration=3.412873412 podStartE2EDuration="1m2.268419119s" podCreationTimestamp="2025-12-01 09:28:02 +0000 UTC" firstStartedPulling="2025-12-01 09:28:04.43668873 +0000 UTC m=+1205.896075474" lastFinishedPulling="2025-12-01 09:29:03.292234427 +0000 UTC m=+1264.751621181" observedRunningTime="2025-12-01 09:29:04.262437184 +0000 UTC m=+1265.721823938" watchObservedRunningTime="2025-12-01 09:29:04.268419119 +0000 UTC m=+1265.727805873" Dec 01 09:29:04 crc kubenswrapper[4867]: I1201 09:29:04.747210 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b56ffdd7f-kp95s"] Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.260212 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b56ffdd7f-kp95s" event={"ID":"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0","Type":"ContainerStarted","Data":"7aa228e58d59c06c8aad53d179c80da196a5a17c5ee904b29b61d62800e16174"} Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.260461 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7b56ffdd7f-kp95s" Dec 01 09:29:05 crc 
kubenswrapper[4867]: I1201 09:29:05.260473 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b56ffdd7f-kp95s" event={"ID":"fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0","Type":"ContainerStarted","Data":"4f4f06db192a1f2abc621b0eeaebdd94252c138ec9f75bbe172e3e7f7eb767bd"} Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.262472 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59b9c878df-5k6nq" event={"ID":"7dea6dbd-f761-4336-b755-0a2c82f6c66b","Type":"ContainerStarted","Data":"4c35549387b90def596648693703ce3765b1547eb49d143d04e04edb0d28471f"} Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.262494 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59b9c878df-5k6nq" event={"ID":"7dea6dbd-f761-4336-b755-0a2c82f6c66b","Type":"ContainerStarted","Data":"138642b1da6e856251f56e439173705d28ca022998e2080883429d04c5b42e3e"} Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.262920 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-59b9c878df-5k6nq" Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.266518 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68bfcdf768-4dtj7" event={"ID":"a57f081c-e4b7-4dbb-a817-4d36052f3145","Type":"ContainerStarted","Data":"fbf9deb666cf5b8867fbabc9bca1e589cff4dffba2e17fa25a59173ee9fef9db"} Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.267045 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68bfcdf768-4dtj7" Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.267071 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68bfcdf768-4dtj7" Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.269099 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xmtc6" 
event={"ID":"65b95ca9-4891-4e69-a789-a21549f94247","Type":"ContainerStarted","Data":"fc37c5defc43343438e3ca09a00cfa564140074fed6fe82681cda085e1796b7f"} Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.307628 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7b56ffdd7f-kp95s" podStartSLOduration=2.307612235 podStartE2EDuration="2.307612235s" podCreationTimestamp="2025-12-01 09:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:05.302009132 +0000 UTC m=+1266.761395886" watchObservedRunningTime="2025-12-01 09:29:05.307612235 +0000 UTC m=+1266.766998989" Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.367915 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-68bfcdf768-4dtj7" podStartSLOduration=12.367894615 podStartE2EDuration="12.367894615s" podCreationTimestamp="2025-12-01 09:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:05.333344039 +0000 UTC m=+1266.792730793" watchObservedRunningTime="2025-12-01 09:29:05.367894615 +0000 UTC m=+1266.827281369" Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.371894 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xmtc6" podStartSLOduration=4.524152081 podStartE2EDuration="1m3.371882094s" podCreationTimestamp="2025-12-01 09:28:02 +0000 UTC" firstStartedPulling="2025-12-01 09:28:04.449059069 +0000 UTC m=+1205.908445863" lastFinishedPulling="2025-12-01 09:29:03.296789122 +0000 UTC m=+1264.756175876" observedRunningTime="2025-12-01 09:29:05.366576449 +0000 UTC m=+1266.825963203" watchObservedRunningTime="2025-12-01 09:29:05.371882094 +0000 UTC m=+1266.831268848" Dec 01 09:29:05 crc kubenswrapper[4867]: I1201 09:29:05.404464 4867 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-59b9c878df-5k6nq" podStartSLOduration=11.404446055 podStartE2EDuration="11.404446055s" podCreationTimestamp="2025-12-01 09:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:05.394341679 +0000 UTC m=+1266.853728433" watchObservedRunningTime="2025-12-01 09:29:05.404446055 +0000 UTC m=+1266.863832809" Dec 01 09:29:10 crc kubenswrapper[4867]: I1201 09:29:10.335092 4867 generic.go:334] "Generic (PLEG): container finished" podID="a891c34b-01dc-4e65-ad1d-b21597555988" containerID="e4f0fd8d89b9a3460463a367a8cb01bf7cfe3e0464b1931cd24446d8df94d90b" exitCode=0 Dec 01 09:29:10 crc kubenswrapper[4867]: I1201 09:29:10.335185 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-56qbp" event={"ID":"a891c34b-01dc-4e65-ad1d-b21597555988","Type":"ContainerDied","Data":"e4f0fd8d89b9a3460463a367a8cb01bf7cfe3e0464b1931cd24446d8df94d90b"} Dec 01 09:29:11 crc kubenswrapper[4867]: I1201 09:29:11.626006 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" Dec 01 09:29:11 crc kubenswrapper[4867]: I1201 09:29:11.674274 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-flkpw"] Dec 01 09:29:11 crc kubenswrapper[4867]: I1201 09:29:11.674495 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" podUID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" containerName="dnsmasq-dns" containerID="cri-o://6e2cccfe484f54023a70759364d017117b21672280aee0dd1f596ffe67b29e55" gracePeriod=10 Dec 01 09:29:12 crc kubenswrapper[4867]: I1201 09:29:12.357343 4867 generic.go:334] "Generic (PLEG): container finished" podID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" 
containerID="6e2cccfe484f54023a70759364d017117b21672280aee0dd1f596ffe67b29e55" exitCode=0 Dec 01 09:29:12 crc kubenswrapper[4867]: I1201 09:29:12.357390 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" event={"ID":"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714","Type":"ContainerDied","Data":"6e2cccfe484f54023a70759364d017117b21672280aee0dd1f596ffe67b29e55"} Dec 01 09:29:12 crc kubenswrapper[4867]: I1201 09:29:12.906228 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 01 09:29:13 crc kubenswrapper[4867]: I1201 09:29:13.295002 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 01 09:29:13 crc kubenswrapper[4867]: I1201 09:29:13.548060 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" podUID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.331853 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-56qbp" Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.384422 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-56qbp" Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.384528 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-56qbp" event={"ID":"a891c34b-01dc-4e65-ad1d-b21597555988","Type":"ContainerDied","Data":"90600535d61aeec92178fd9653eade62cdc9367914d5a4c25737c473d1e50ea7"} Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.384572 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90600535d61aeec92178fd9653eade62cdc9367914d5a4c25737c473d1e50ea7" Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.385247 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w979\" (UniqueName: \"kubernetes.io/projected/a891c34b-01dc-4e65-ad1d-b21597555988-kube-api-access-2w979\") pod \"a891c34b-01dc-4e65-ad1d-b21597555988\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.385322 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-db-sync-config-data\") pod \"a891c34b-01dc-4e65-ad1d-b21597555988\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.385650 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-combined-ca-bundle\") pod \"a891c34b-01dc-4e65-ad1d-b21597555988\" (UID: \"a891c34b-01dc-4e65-ad1d-b21597555988\") " Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.393433 4867 generic.go:334] "Generic (PLEG): container finished" podID="65b95ca9-4891-4e69-a789-a21549f94247" containerID="fc37c5defc43343438e3ca09a00cfa564140074fed6fe82681cda085e1796b7f" exitCode=0 Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.393685 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xmtc6" event={"ID":"65b95ca9-4891-4e69-a789-a21549f94247","Type":"ContainerDied","Data":"fc37c5defc43343438e3ca09a00cfa564140074fed6fe82681cda085e1796b7f"} Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.398706 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a891c34b-01dc-4e65-ad1d-b21597555988" (UID: "a891c34b-01dc-4e65-ad1d-b21597555988"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.403061 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a891c34b-01dc-4e65-ad1d-b21597555988-kube-api-access-2w979" (OuterVolumeSpecName: "kube-api-access-2w979") pod "a891c34b-01dc-4e65-ad1d-b21597555988" (UID: "a891c34b-01dc-4e65-ad1d-b21597555988"). InnerVolumeSpecName "kube-api-access-2w979". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.442151 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a891c34b-01dc-4e65-ad1d-b21597555988" (UID: "a891c34b-01dc-4e65-ad1d-b21597555988"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.487574 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.487617 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w979\" (UniqueName: \"kubernetes.io/projected/a891c34b-01dc-4e65-ad1d-b21597555988-kube-api-access-2w979\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:14 crc kubenswrapper[4867]: I1201 09:29:14.487629 4867 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a891c34b-01dc-4e65-ad1d-b21597555988-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.731950 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65fbb9cf75-989xz"] Dec 01 09:29:15 crc kubenswrapper[4867]: E1201 09:29:15.744496 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a891c34b-01dc-4e65-ad1d-b21597555988" containerName="barbican-db-sync" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.744523 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a891c34b-01dc-4e65-ad1d-b21597555988" containerName="barbican-db-sync" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.744711 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a891c34b-01dc-4e65-ad1d-b21597555988" containerName="barbican-db-sync" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.745586 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.751445 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fqlhc" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.751617 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.768138 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.776889 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65fbb9cf75-989xz"] Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.818681 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8469f9a0-94d4-4c2c-839a-80d619a2d984-config-data-custom\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.818759 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8469f9a0-94d4-4c2c-839a-80d619a2d984-combined-ca-bundle\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.818797 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8469f9a0-94d4-4c2c-839a-80d619a2d984-config-data\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " 
pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.818882 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8469f9a0-94d4-4c2c-839a-80d619a2d984-logs\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.818949 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5w6s\" (UniqueName: \"kubernetes.io/projected/8469f9a0-94d4-4c2c-839a-80d619a2d984-kube-api-access-g5w6s\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.850449 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-f9f6fdc98-l7cht"] Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.852547 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.857339 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.902965 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9tm6"] Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.904794 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920169 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5w6s\" (UniqueName: \"kubernetes.io/projected/8469f9a0-94d4-4c2c-839a-80d619a2d984-kube-api-access-g5w6s\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920252 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8469f9a0-94d4-4c2c-839a-80d619a2d984-config-data-custom\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920304 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-combined-ca-bundle\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920327 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8469f9a0-94d4-4c2c-839a-80d619a2d984-combined-ca-bundle\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920372 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-config-data-custom\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920391 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8469f9a0-94d4-4c2c-839a-80d619a2d984-config-data\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920431 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8469f9a0-94d4-4c2c-839a-80d619a2d984-logs\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920485 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-config-data\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920505 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-logs\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.920523 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8jd4f\" (UniqueName: \"kubernetes.io/projected/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-kube-api-access-8jd4f\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.927860 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8469f9a0-94d4-4c2c-839a-80d619a2d984-logs\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.937985 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8469f9a0-94d4-4c2c-839a-80d619a2d984-combined-ca-bundle\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.943897 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f9f6fdc98-l7cht"] Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.951000 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8469f9a0-94d4-4c2c-839a-80d619a2d984-config-data-custom\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.962480 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8469f9a0-94d4-4c2c-839a-80d619a2d984-config-data\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " 
pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.963939 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5w6s\" (UniqueName: \"kubernetes.io/projected/8469f9a0-94d4-4c2c-839a-80d619a2d984-kube-api-access-g5w6s\") pod \"barbican-worker-65fbb9cf75-989xz\" (UID: \"8469f9a0-94d4-4c2c-839a-80d619a2d984\") " pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:15 crc kubenswrapper[4867]: I1201 09:29:15.964584 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9tm6"] Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.032661 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.032980 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-combined-ca-bundle\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.033114 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjgx9\" (UniqueName: \"kubernetes.io/projected/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-kube-api-access-zjgx9\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.033254 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.033359 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-config-data-custom\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.034654 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.035445 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-config\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.035703 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.035971 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-config-data\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.036235 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jd4f\" (UniqueName: \"kubernetes.io/projected/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-kube-api-access-8jd4f\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.040296 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-logs\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.043306 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-logs\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.060447 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jd4f\" (UniqueName: \"kubernetes.io/projected/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-kube-api-access-8jd4f\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc 
kubenswrapper[4867]: I1201 09:29:16.065244 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-config-data\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.067003 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-combined-ca-bundle\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.068792 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2-config-data-custom\") pod \"barbican-keystone-listener-f9f6fdc98-l7cht\" (UID: \"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2\") " pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.098880 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58d9c4988b-2fdgd"] Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.101752 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.113287 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65fbb9cf75-989xz" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.120242 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.131469 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58d9c4988b-2fdgd"] Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.146179 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.146273 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjgx9\" (UniqueName: \"kubernetes.io/projected/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-kube-api-access-zjgx9\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.146331 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.146389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc 
kubenswrapper[4867]: I1201 09:29:16.146432 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-config\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.146470 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.147688 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.148983 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.149645 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-config\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.150491 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.152498 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.210010 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.249356 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmcl8\" (UniqueName: \"kubernetes.io/projected/f42bd489-b6c5-4f24-8da8-2b01860a71d2-kube-api-access-dmcl8\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.249429 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data-custom\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.249493 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-combined-ca-bundle\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: 
\"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.249563 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f42bd489-b6c5-4f24-8da8-2b01860a71d2-logs\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.249672 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.269469 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjgx9\" (UniqueName: \"kubernetes.io/projected/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-kube-api-access-zjgx9\") pod \"dnsmasq-dns-85ff748b95-m9tm6\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.352961 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data-custom\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.353063 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-combined-ca-bundle\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: 
\"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.353138 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f42bd489-b6c5-4f24-8da8-2b01860a71d2-logs\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.353663 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f42bd489-b6c5-4f24-8da8-2b01860a71d2-logs\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.354269 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.354360 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmcl8\" (UniqueName: \"kubernetes.io/projected/f42bd489-b6c5-4f24-8da8-2b01860a71d2-kube-api-access-dmcl8\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.380517 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-combined-ca-bundle\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" 
Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.380707 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data-custom\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.386834 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.412744 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmcl8\" (UniqueName: \"kubernetes.io/projected/f42bd489-b6c5-4f24-8da8-2b01860a71d2-kube-api-access-dmcl8\") pod \"barbican-api-58d9c4988b-2fdgd\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.464095 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:16 crc kubenswrapper[4867]: I1201 09:29:16.482213 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.864287 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.903415 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-db-sync-config-data\") pod \"65b95ca9-4891-4e69-a789-a21549f94247\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.903520 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-combined-ca-bundle\") pod \"65b95ca9-4891-4e69-a789-a21549f94247\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.903569 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65b95ca9-4891-4e69-a789-a21549f94247-etc-machine-id\") pod \"65b95ca9-4891-4e69-a789-a21549f94247\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.903633 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-scripts\") pod \"65b95ca9-4891-4e69-a789-a21549f94247\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.903702 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8jtw\" (UniqueName: \"kubernetes.io/projected/65b95ca9-4891-4e69-a789-a21549f94247-kube-api-access-n8jtw\") pod \"65b95ca9-4891-4e69-a789-a21549f94247\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.903750 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-config-data\") pod \"65b95ca9-4891-4e69-a789-a21549f94247\" (UID: \"65b95ca9-4891-4e69-a789-a21549f94247\") " Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.904767 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65b95ca9-4891-4e69-a789-a21549f94247-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "65b95ca9-4891-4e69-a789-a21549f94247" (UID: "65b95ca9-4891-4e69-a789-a21549f94247"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.916162 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "65b95ca9-4891-4e69-a789-a21549f94247" (UID: "65b95ca9-4891-4e69-a789-a21549f94247"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.929073 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b95ca9-4891-4e69-a789-a21549f94247-kube-api-access-n8jtw" (OuterVolumeSpecName: "kube-api-access-n8jtw") pod "65b95ca9-4891-4e69-a789-a21549f94247" (UID: "65b95ca9-4891-4e69-a789-a21549f94247"). InnerVolumeSpecName "kube-api-access-n8jtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.952958 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-scripts" (OuterVolumeSpecName: "scripts") pod "65b95ca9-4891-4e69-a789-a21549f94247" (UID: "65b95ca9-4891-4e69-a789-a21549f94247"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:17 crc kubenswrapper[4867]: I1201 09:29:17.956894 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.007334 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-nb\") pod \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.007882 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-svc\") pod \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.008018 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-config\") pod \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.008278 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ddz2\" (UniqueName: \"kubernetes.io/projected/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-kube-api-access-4ddz2\") pod \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.008372 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-sb\") pod \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " 
Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.008494 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-swift-storage-0\") pod \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\" (UID: \"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714\") " Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.023378 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.023417 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8jtw\" (UniqueName: \"kubernetes.io/projected/65b95ca9-4891-4e69-a789-a21549f94247-kube-api-access-n8jtw\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.023435 4867 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.023448 4867 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65b95ca9-4891-4e69-a789-a21549f94247-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.025060 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-config-data" (OuterVolumeSpecName: "config-data") pod "65b95ca9-4891-4e69-a789-a21549f94247" (UID: "65b95ca9-4891-4e69-a789-a21549f94247"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.029414 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65b95ca9-4891-4e69-a789-a21549f94247" (UID: "65b95ca9-4891-4e69-a789-a21549f94247"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.036273 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-kube-api-access-4ddz2" (OuterVolumeSpecName: "kube-api-access-4ddz2") pod "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" (UID: "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714"). InnerVolumeSpecName "kube-api-access-4ddz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.142936 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ddz2\" (UniqueName: \"kubernetes.io/projected/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-kube-api-access-4ddz2\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.142972 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.142989 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b95ca9-4891-4e69-a789-a21549f94247-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.303680 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-sb" 
(OuterVolumeSpecName: "ovsdbserver-sb") pod "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" (UID: "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.334717 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" (UID: "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.355177 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" (UID: "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.373827 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-config" (OuterVolumeSpecName: "config") pod "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" (UID: "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.374639 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.374654 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.374665 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.374676 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.404745 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" (UID: "5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.478946 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:18 crc kubenswrapper[4867]: E1201 09:29:18.495334 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.507129 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xmtc6" event={"ID":"65b95ca9-4891-4e69-a789-a21549f94247","Type":"ContainerDied","Data":"2762bd279ced79faaafbafa2f78f02e2ab2a6df0cbc4f5da18d6866d70e6d419"} Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.507178 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2762bd279ced79faaafbafa2f78f02e2ab2a6df0cbc4f5da18d6866d70e6d419" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.507249 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xmtc6" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.523148 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.523143 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-flkpw" event={"ID":"5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714","Type":"ContainerDied","Data":"c5f2da736fe042cca441f31b99ed3e62d9daa06d6218835002e6f3cb799bed72"} Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.523323 4867 scope.go:117] "RemoveContainer" containerID="6e2cccfe484f54023a70759364d017117b21672280aee0dd1f596ffe67b29e55" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.593039 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="ceilometer-notification-agent" containerID="cri-o://5594fca1ae40081291997daa714acc24d25714466ae4978c2e34205fd9c3d19e" gracePeriod=30 Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.593544 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.593793 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="proxy-httpd" containerID="cri-o://c494b44d480a3be9d4f55f1624169ffed89657dc35ce5eb48a15b3632531591b" gracePeriod=30 Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.593868 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="sg-core" containerID="cri-o://0964fd9990b16216ca47b3934f9501f15aa167e243b4ebd3de35d37e248c8151" gracePeriod=30 Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.595138 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-flkpw"] Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.596498 4867 
scope.go:117] "RemoveContainer" containerID="e499648acc00bf09c624be59d58019ff8d070ffdcbd51ce42df9d55b760259e2" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.624853 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-flkpw"] Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.646336 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9tm6"] Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.808031 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58d9c4988b-2fdgd"] Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.939691 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" path="/var/lib/kubelet/pods/5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714/volumes" Dec 01 09:29:18 crc kubenswrapper[4867]: I1201 09:29:18.940895 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65fbb9cf75-989xz"] Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.049658 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f9f6fdc98-l7cht"] Dec 01 09:29:19 crc kubenswrapper[4867]: W1201 09:29:19.071696 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1d8136b_aa0f_4cbe_b56a_6151d5ab8ce2.slice/crio-85c236411389daccb52c0d7ab14e0794ec474c3b9fa0bfacabeba744747bf15d WatchSource:0}: Error finding container 85c236411389daccb52c0d7ab14e0794ec474c3b9fa0bfacabeba744747bf15d: Status 404 returned error can't find the container with id 85c236411389daccb52c0d7ab14e0794ec474c3b9fa0bfacabeba744747bf15d Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.236886 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:29:19 crc kubenswrapper[4867]: E1201 09:29:19.237272 4867 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" containerName="init" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.237283 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" containerName="init" Dec 01 09:29:19 crc kubenswrapper[4867]: E1201 09:29:19.237296 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" containerName="dnsmasq-dns" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.237304 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" containerName="dnsmasq-dns" Dec 01 09:29:19 crc kubenswrapper[4867]: E1201 09:29:19.237321 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b95ca9-4891-4e69-a789-a21549f94247" containerName="cinder-db-sync" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.237328 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b95ca9-4891-4e69-a789-a21549f94247" containerName="cinder-db-sync" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.237490 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5261ec02-2e7a-4e2a-aeb9-d7ec0e1cf714" containerName="dnsmasq-dns" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.237721 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b95ca9-4891-4e69-a789-a21549f94247" containerName="cinder-db-sync" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.245218 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.257252 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.258342 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.258509 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5f4nz" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.262931 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.299534 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.370344 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4p8x\" (UniqueName: \"kubernetes.io/projected/031c6adf-727b-441f-b977-6feacc9a2c31-kube-api-access-v4p8x\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.370699 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/031c6adf-727b-441f-b977-6feacc9a2c31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.370750 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.370806 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.370861 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.370895 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-scripts\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.377302 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9tm6"] Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.474842 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.474926 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.474974 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-scripts\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.475002 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4p8x\" (UniqueName: \"kubernetes.io/projected/031c6adf-727b-441f-b977-6feacc9a2c31-kube-api-access-v4p8x\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.475045 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/031c6adf-727b-441f-b977-6feacc9a2c31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.475102 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.476768 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/031c6adf-727b-441f-b977-6feacc9a2c31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " 
pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.489430 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.489510 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.491791 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.501959 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cvdlf"] Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.510705 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.524327 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-scripts\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.559356 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4p8x\" (UniqueName: \"kubernetes.io/projected/031c6adf-727b-441f-b977-6feacc9a2c31-kube-api-access-v4p8x\") pod \"cinder-scheduler-0\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") " pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.588906 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.596030 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cvdlf"] Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.679265 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.679395 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 
09:29:19.679464 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fclg8\" (UniqueName: \"kubernetes.io/projected/795e69a3-9500-444d-8d6e-af50ede7c060-kube-api-access-fclg8\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.679592 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.679731 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.679772 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-config\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.690283 4867 generic.go:334] "Generic (PLEG): container finished" podID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerID="c494b44d480a3be9d4f55f1624169ffed89657dc35ce5eb48a15b3632531591b" exitCode=0 Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.690315 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerID="0964fd9990b16216ca47b3934f9501f15aa167e243b4ebd3de35d37e248c8151" exitCode=2 Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.690371 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0509290-ed5f-4982-bae7-8710f1eeb88f","Type":"ContainerDied","Data":"c494b44d480a3be9d4f55f1624169ffed89657dc35ce5eb48a15b3632531591b"} Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.690395 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0509290-ed5f-4982-bae7-8710f1eeb88f","Type":"ContainerDied","Data":"0964fd9990b16216ca47b3934f9501f15aa167e243b4ebd3de35d37e248c8151"} Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.705024 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" event={"ID":"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2","Type":"ContainerStarted","Data":"85c236411389daccb52c0d7ab14e0794ec474c3b9fa0bfacabeba744747bf15d"} Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.725616 4867 generic.go:334] "Generic (PLEG): container finished" podID="d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" containerID="fadd0380e2cf50ad59e8a5f26aca95476eff67922e92a8f625b13e4e3d5b873d" exitCode=0 Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.725685 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" event={"ID":"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0","Type":"ContainerDied","Data":"fadd0380e2cf50ad59e8a5f26aca95476eff67922e92a8f625b13e4e3d5b873d"} Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.725709 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" event={"ID":"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0","Type":"ContainerStarted","Data":"2eb24b75516fca84268dd27e1bb120be79d6a59adcf8f3a119025c0301d910be"} Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 
09:29:19.734719 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fbb9cf75-989xz" event={"ID":"8469f9a0-94d4-4c2c-839a-80d619a2d984","Type":"ContainerStarted","Data":"651a2f383406ce164b3a3de211a4962d0b6d4680c43ef8f205c807f7f283b444"} Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.742989 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d9c4988b-2fdgd" event={"ID":"f42bd489-b6c5-4f24-8da8-2b01860a71d2","Type":"ContainerStarted","Data":"5c6315d1c4d92454d541c51b1f6ffce5eb2a5698612bce3c8c6adc1f134904c9"} Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.781474 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.781543 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.781576 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fclg8\" (UniqueName: \"kubernetes.io/projected/795e69a3-9500-444d-8d6e-af50ede7c060-kube-api-access-fclg8\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.781608 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.781663 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.781694 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-config\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.782616 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-config\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.783250 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.784355 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-svc\") pod 
\"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.785515 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.786786 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.827733 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.830136 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.855472 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.860262 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fclg8\" (UniqueName: \"kubernetes.io/projected/795e69a3-9500-444d-8d6e-af50ede7c060-kube-api-access-fclg8\") pod \"dnsmasq-dns-5c9776ccc5-cvdlf\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.945593 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:29:19 crc kubenswrapper[4867]: I1201 09:29:19.973758 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.001720 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-logs\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.001822 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.001844 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.002258 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkckk\" (UniqueName: \"kubernetes.io/projected/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-kube-api-access-bkckk\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.002334 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.002387 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-scripts\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.002406 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.103692 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkckk\" (UniqueName: \"kubernetes.io/projected/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-kube-api-access-bkckk\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 
09:29:20.103736 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.103761 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-scripts\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.103777 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.103862 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-logs\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.103910 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.103924 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.106961 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-logs\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.107886 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.125892 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.133612 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-scripts\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.134360 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data-custom\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.160993 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.199202 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkckk\" (UniqueName: \"kubernetes.io/projected/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-kube-api-access-bkckk\") pod \"cinder-api-0\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: E1201 09:29:20.412645 4867 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 01 09:29:20 crc kubenswrapper[4867]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 09:29:20 crc kubenswrapper[4867]: > podSandboxID="2eb24b75516fca84268dd27e1bb120be79d6a59adcf8f3a119025c0301d910be" Dec 01 09:29:20 crc kubenswrapper[4867]: E1201 09:29:20.413076 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 01 09:29:20 crc kubenswrapper[4867]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7ch57ch5c5hcch589hf7h577h659h96h5c8h5b4h55fhbbh667h565h5bchcbh58dh7dh5bch586h56ch574h598h67dh5c8h56dh8bh574h564hbch7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zjgx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-85ff748b95-m9tm6_openstack(d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 09:29:20 crc kubenswrapper[4867]: > logger="UnhandledError" Dec 01 09:29:20 crc kubenswrapper[4867]: E1201 09:29:20.415085 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" podUID="d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.455247 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.775303 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.802610 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d9c4988b-2fdgd" event={"ID":"f42bd489-b6c5-4f24-8da8-2b01860a71d2","Type":"ContainerStarted","Data":"88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c"} Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.802652 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.802663 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d9c4988b-2fdgd" event={"ID":"f42bd489-b6c5-4f24-8da8-2b01860a71d2","Type":"ContainerStarted","Data":"3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be"} Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.803230 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.889500 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58d9c4988b-2fdgd" podStartSLOduration=5.889479584 podStartE2EDuration="5.889479584s" podCreationTimestamp="2025-12-01 09:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:20.866263527 +0000 UTC m=+1282.325650291" watchObservedRunningTime="2025-12-01 09:29:20.889479584 +0000 UTC m=+1282.348866328" Dec 01 09:29:20 crc kubenswrapper[4867]: I1201 09:29:20.916179 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cvdlf"] Dec 01 09:29:20 crc kubenswrapper[4867]: W1201 09:29:20.971306 4867 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod795e69a3_9500_444d_8d6e_af50ede7c060.slice/crio-17646e1825c13641d883345ffd53716becf72db2c6103a6e7f4cda506a887cba WatchSource:0}: Error finding container 17646e1825c13641d883345ffd53716becf72db2c6103a6e7f4cda506a887cba: Status 404 returned error can't find the container with id 17646e1825c13641d883345ffd53716becf72db2c6103a6e7f4cda506a887cba Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.283616 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.464501 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.493431 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjgx9\" (UniqueName: \"kubernetes.io/projected/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-kube-api-access-zjgx9\") pod \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.493491 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-svc\") pod \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.493564 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-config\") pod \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.501023 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-kube-api-access-zjgx9" (OuterVolumeSpecName: "kube-api-access-zjgx9") pod "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" (UID: "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0"). InnerVolumeSpecName "kube-api-access-zjgx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.585402 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-config" (OuterVolumeSpecName: "config") pod "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" (UID: "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.595484 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-sb\") pod \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.595569 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-swift-storage-0\") pod \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.595591 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-nb\") pod \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.596191 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjgx9\" (UniqueName: 
\"kubernetes.io/projected/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-kube-api-access-zjgx9\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.596211 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.603964 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.604021 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.608451 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" (UID: "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.686381 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" (UID: "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.701195 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.701225 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:21 crc kubenswrapper[4867]: E1201 09:29:21.715492 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-nb podName:d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:22.215467424 +0000 UTC m=+1283.674854178 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-nb") pod "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" (UID: "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0") : error deleting /var/lib/kubelet/pods/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0/volume-subpaths: remove /var/lib/kubelet/pods/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0/volume-subpaths: no such file or directory Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.715779 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" (UID: "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.805452 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.860221 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"031c6adf-727b-441f-b977-6feacc9a2c31","Type":"ContainerStarted","Data":"a40c91a76ecd87eb65f16a343b8bad5d889b0184197ceb233d7fd79c21072ec4"} Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.881032 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" event={"ID":"795e69a3-9500-444d-8d6e-af50ede7c060","Type":"ContainerStarted","Data":"5aa650b1b5b93fe95bb5bd4047a27e48a4c82478dcc0cb4ef20d9a4ee025cfe9"} Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.881084 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" event={"ID":"795e69a3-9500-444d-8d6e-af50ede7c060","Type":"ContainerStarted","Data":"17646e1825c13641d883345ffd53716becf72db2c6103a6e7f4cda506a887cba"} Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.885933 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" event={"ID":"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0","Type":"ContainerDied","Data":"2eb24b75516fca84268dd27e1bb120be79d6a59adcf8f3a119025c0301d910be"} Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.885986 4867 scope.go:117] "RemoveContainer" containerID="fadd0380e2cf50ad59e8a5f26aca95476eff67922e92a8f625b13e4e3d5b873d" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.886121 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9tm6" Dec 01 09:29:21 crc kubenswrapper[4867]: I1201 09:29:21.918130 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9","Type":"ContainerStarted","Data":"58467cfa663b2f0ec2e3e63e50a424c248ed289cd0ed8958bd261c634726c966"} Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.205400 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.315001 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-nb\") pod \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\" (UID: \"d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0\") " Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.327708 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" (UID: "d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.330318 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.569504 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9tm6"] Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.585266 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9tm6"] Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.836626 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" path="/var/lib/kubelet/pods/d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0/volumes" Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.906725 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.906842 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.907696 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"686c5303f0412b7b582b8c491b3e8223fe86fdd2e4836a2991c0f50fae8a3067"} pod="openstack/horizon-c846795f4-k7mlj" containerMessage="Container horizon failed startup probe, will be restarted" Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.907738 4867 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" containerID="cri-o://686c5303f0412b7b582b8c491b3e8223fe86fdd2e4836a2991c0f50fae8a3067" gracePeriod=30 Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.943160 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9","Type":"ContainerStarted","Data":"b9739e26604ff8379e6fc45511585a975efe1449e0e11a77b884989dfec9beff"} Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.945439 4867 generic.go:334] "Generic (PLEG): container finished" podID="795e69a3-9500-444d-8d6e-af50ede7c060" containerID="5aa650b1b5b93fe95bb5bd4047a27e48a4c82478dcc0cb4ef20d9a4ee025cfe9" exitCode=0 Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.946648 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" event={"ID":"795e69a3-9500-444d-8d6e-af50ede7c060","Type":"ContainerDied","Data":"5aa650b1b5b93fe95bb5bd4047a27e48a4c82478dcc0cb4ef20d9a4ee025cfe9"} Dec 01 09:29:22 crc kubenswrapper[4867]: I1201 09:29:22.972519 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:29:23 crc kubenswrapper[4867]: I1201 09:29:23.296268 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 01 09:29:23 crc kubenswrapper[4867]: I1201 09:29:23.296644 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:29:23 crc kubenswrapper[4867]: I1201 09:29:23.297417 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" 
containerStatusID={"Type":"cri-o","ID":"9d03af7b1362790fa6ac6592121987809cbd79e15e394cba2fd458a1d0946120"} pod="openstack/horizon-d47c7cb76-srf4p" containerMessage="Container horizon failed startup probe, will be restarted" Dec 01 09:29:23 crc kubenswrapper[4867]: I1201 09:29:23.297450 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" containerID="cri-o://9d03af7b1362790fa6ac6592121987809cbd79e15e394cba2fd458a1d0946120" gracePeriod=30 Dec 01 09:29:23 crc kubenswrapper[4867]: I1201 09:29:23.976154 4867 generic.go:334] "Generic (PLEG): container finished" podID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerID="5594fca1ae40081291997daa714acc24d25714466ae4978c2e34205fd9c3d19e" exitCode=0 Dec 01 09:29:23 crc kubenswrapper[4867]: I1201 09:29:23.976358 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0509290-ed5f-4982-bae7-8710f1eeb88f","Type":"ContainerDied","Data":"5594fca1ae40081291997daa714acc24d25714466ae4978c2e34205fd9c3d19e"} Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.642281 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.654214 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b97bc66cd-p4vv6"] Dec 01 09:29:24 crc kubenswrapper[4867]: E1201 09:29:24.654706 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="ceilometer-notification-agent" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.654729 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="ceilometer-notification-agent" Dec 01 09:29:24 crc kubenswrapper[4867]: E1201 09:29:24.654744 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" containerName="init" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.654755 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" containerName="init" Dec 01 09:29:24 crc kubenswrapper[4867]: E1201 09:29:24.654780 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="proxy-httpd" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.654788 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="proxy-httpd" Dec 01 09:29:24 crc kubenswrapper[4867]: E1201 09:29:24.654805 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="sg-core" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.654828 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="sg-core" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.655051 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" 
containerName="ceilometer-notification-agent" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.655065 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a40e9d-3eeb-482d-9c2b-c6fd4c54a4e0" containerName="init" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.655084 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="sg-core" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.655105 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" containerName="proxy-httpd" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.656291 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.661641 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.661779 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.698957 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b97bc66cd-p4vv6"] Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.700135 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-59b9c878df-5k6nq" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.702882 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-log-httpd\") pod \"e0509290-ed5f-4982-bae7-8710f1eeb88f\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.703024 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-scripts\") pod \"e0509290-ed5f-4982-bae7-8710f1eeb88f\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.703168 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-sg-core-conf-yaml\") pod \"e0509290-ed5f-4982-bae7-8710f1eeb88f\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.703253 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-config-data\") pod \"e0509290-ed5f-4982-bae7-8710f1eeb88f\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.703324 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwqhj\" (UniqueName: \"kubernetes.io/projected/e0509290-ed5f-4982-bae7-8710f1eeb88f-kube-api-access-hwqhj\") pod \"e0509290-ed5f-4982-bae7-8710f1eeb88f\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.703364 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-run-httpd\") pod \"e0509290-ed5f-4982-bae7-8710f1eeb88f\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.703442 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-combined-ca-bundle\") pod \"e0509290-ed5f-4982-bae7-8710f1eeb88f\" (UID: \"e0509290-ed5f-4982-bae7-8710f1eeb88f\") " Dec 01 09:29:24 crc 
kubenswrapper[4867]: I1201 09:29:24.703945 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-combined-ca-bundle\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.704010 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-internal-tls-certs\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.704561 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-config-data\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.704594 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-public-tls-certs\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.704658 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-config-data-custom\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " 
pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.704804 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zww9\" (UniqueName: \"kubernetes.io/projected/f948002f-f1df-40b5-8fcc-db28284c2609-kube-api-access-5zww9\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.704873 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f948002f-f1df-40b5-8fcc-db28284c2609-logs\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.705773 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e0509290-ed5f-4982-bae7-8710f1eeb88f" (UID: "e0509290-ed5f-4982-bae7-8710f1eeb88f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.710769 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e0509290-ed5f-4982-bae7-8710f1eeb88f" (UID: "e0509290-ed5f-4982-bae7-8710f1eeb88f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.744109 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-scripts" (OuterVolumeSpecName: "scripts") pod "e0509290-ed5f-4982-bae7-8710f1eeb88f" (UID: "e0509290-ed5f-4982-bae7-8710f1eeb88f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.756266 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.766102 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.783003 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0509290-ed5f-4982-bae7-8710f1eeb88f-kube-api-access-hwqhj" (OuterVolumeSpecName: "kube-api-access-hwqhj") pod "e0509290-ed5f-4982-bae7-8710f1eeb88f" (UID: "e0509290-ed5f-4982-bae7-8710f1eeb88f"). InnerVolumeSpecName "kube-api-access-hwqhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.816148 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-config-data-custom\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.816407 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zww9\" (UniqueName: \"kubernetes.io/projected/f948002f-f1df-40b5-8fcc-db28284c2609-kube-api-access-5zww9\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.816463 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f948002f-f1df-40b5-8fcc-db28284c2609-logs\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.816587 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-combined-ca-bundle\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.816908 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-internal-tls-certs\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " 
pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.816931 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-config-data\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.816960 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-public-tls-certs\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.817043 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.817059 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwqhj\" (UniqueName: \"kubernetes.io/projected/e0509290-ed5f-4982-bae7-8710f1eeb88f-kube-api-access-hwqhj\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.817075 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.817086 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0509290-ed5f-4982-bae7-8710f1eeb88f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.822038 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f948002f-f1df-40b5-8fcc-db28284c2609-logs\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.842128 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-config-data\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.882708 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-config-data-custom\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.889220 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-internal-tls-certs\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.889603 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-public-tls-certs\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.890243 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zww9\" (UniqueName: 
\"kubernetes.io/projected/f948002f-f1df-40b5-8fcc-db28284c2609-kube-api-access-5zww9\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.915232 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7df757454-fcwhf"] Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.915518 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7df757454-fcwhf" podUID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerName="neutron-api" containerID="cri-o://2da9838456e320ab2d731917161fedca6a8b98ca8c9cb79901fd72110515d5bf" gracePeriod=30 Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.915692 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7df757454-fcwhf" podUID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerName="neutron-httpd" containerID="cri-o://9bce89257fead3a13690ba81e640c75d7fce7e49bb50a9dd1dbc7cc2e4c7b7eb" gracePeriod=30 Dec 01 09:29:24 crc kubenswrapper[4867]: I1201 09:29:24.924114 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f948002f-f1df-40b5-8fcc-db28284c2609-combined-ca-bundle\") pod \"barbican-api-7b97bc66cd-p4vv6\" (UID: \"f948002f-f1df-40b5-8fcc-db28284c2609\") " pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.005968 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e0509290-ed5f-4982-bae7-8710f1eeb88f" (UID: "e0509290-ed5f-4982-bae7-8710f1eeb88f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.034552 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.063329 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.063702 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0509290-ed5f-4982-bae7-8710f1eeb88f","Type":"ContainerDied","Data":"4e5cba66c72b3746a5204d32971df880a01c6841f8a84fcc0507a3ba4d3ded53"} Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.064378 4867 scope.go:117] "RemoveContainer" containerID="c494b44d480a3be9d4f55f1624169ffed89657dc35ce5eb48a15b3632531591b" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.065101 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0509290-ed5f-4982-bae7-8710f1eeb88f" (UID: "e0509290-ed5f-4982-bae7-8710f1eeb88f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.136253 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.147760 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.154974 4867 scope.go:117] "RemoveContainer" containerID="0964fd9990b16216ca47b3934f9501f15aa167e243b4ebd3de35d37e248c8151" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.268744 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-config-data" (OuterVolumeSpecName: "config-data") pod "e0509290-ed5f-4982-bae7-8710f1eeb88f" (UID: "e0509290-ed5f-4982-bae7-8710f1eeb88f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.348921 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0509290-ed5f-4982-bae7-8710f1eeb88f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.350410 4867 scope.go:117] "RemoveContainer" containerID="5594fca1ae40081291997daa714acc24d25714466ae4978c2e34205fd9c3d19e" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.667349 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.680568 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.698719 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.702191 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.711315 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.711511 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.771867 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-log-httpd\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.771943 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-config-data\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.771966 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8bdt\" (UniqueName: \"kubernetes.io/projected/318a4560-9e79-46fe-96bf-aaa534848b45-kube-api-access-j8bdt\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.771983 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-scripts\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.772006 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.772022 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.772041 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-run-httpd\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.784328 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.879397 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-log-httpd\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.879624 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-config-data\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.879645 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-j8bdt\" (UniqueName: \"kubernetes.io/projected/318a4560-9e79-46fe-96bf-aaa534848b45-kube-api-access-j8bdt\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.879665 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-scripts\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.879688 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.879705 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.879722 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-run-httpd\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.891271 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-log-httpd\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 
01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.891503 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-run-httpd\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.892327 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-scripts\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.903032 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-config-data\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.906505 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.915424 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:25 crc kubenswrapper[4867]: I1201 09:29:25.916169 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8bdt\" (UniqueName: \"kubernetes.io/projected/318a4560-9e79-46fe-96bf-aaa534848b45-kube-api-access-j8bdt\") pod \"ceilometer-0\" 
(UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " pod="openstack/ceilometer-0" Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.068443 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.192969 4867 generic.go:334] "Generic (PLEG): container finished" podID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerID="9bce89257fead3a13690ba81e640c75d7fce7e49bb50a9dd1dbc7cc2e4c7b7eb" exitCode=0 Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.193047 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7df757454-fcwhf" event={"ID":"35ebb9d5-af37-425c-b29b-c4f98eab213a","Type":"ContainerDied","Data":"9bce89257fead3a13690ba81e640c75d7fce7e49bb50a9dd1dbc7cc2e4c7b7eb"} Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.193078 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b97bc66cd-p4vv6"] Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.218965 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fbb9cf75-989xz" event={"ID":"8469f9a0-94d4-4c2c-839a-80d619a2d984","Type":"ContainerStarted","Data":"1a71e2bfab2ff0093e40895cb5070c1608dcc77fe49413d14c5b6a2725e0c283"} Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.309073 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" event={"ID":"795e69a3-9500-444d-8d6e-af50ede7c060","Type":"ContainerStarted","Data":"6a8e1b17c2b8ffea65982c5b03bd3eb663e08f3f4a7c08352d8ed2645d0382b0"} Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.309901 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.341520 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" 
event={"ID":"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2","Type":"ContainerStarted","Data":"a2af2e16049de245800f6c40fd5bf82d5f2728c046d908f54b579eba2bbf8757"} Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.368589 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" podStartSLOduration=7.368573185 podStartE2EDuration="7.368573185s" podCreationTimestamp="2025-12-01 09:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:26.367061783 +0000 UTC m=+1287.826448547" watchObservedRunningTime="2025-12-01 09:29:26.368573185 +0000 UTC m=+1287.827959939" Dec 01 09:29:26 crc kubenswrapper[4867]: I1201 09:29:26.842854 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0509290-ed5f-4982-bae7-8710f1eeb88f" path="/var/lib/kubelet/pods/e0509290-ed5f-4982-bae7-8710f1eeb88f/volumes" Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.076206 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.398709 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"031c6adf-727b-441f-b977-6feacc9a2c31","Type":"ContainerStarted","Data":"aee1b40aa22d0bdc5c7ff47b30cfb5c67fc10f14fe4b04998cc5977244622b65"} Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.429068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerStarted","Data":"22ea4e93b15affb07f1afa78412ecddc47450ea74f5f616f67a4f1e5e0f8f5c7"} Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.442328 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fbb9cf75-989xz" 
event={"ID":"8469f9a0-94d4-4c2c-839a-80d619a2d984","Type":"ContainerStarted","Data":"756ff49b723c969eefe15033495378b7e8507a0ceb49c6af94b6f61ee9c7949d"} Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.454002 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b97bc66cd-p4vv6" event={"ID":"f948002f-f1df-40b5-8fcc-db28284c2609","Type":"ContainerStarted","Data":"46d281591ab33e7328f3f24d15d5440ce868c8241d25a32e67de344569f87254"} Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.454054 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b97bc66cd-p4vv6" event={"ID":"f948002f-f1df-40b5-8fcc-db28284c2609","Type":"ContainerStarted","Data":"e3e6502d60ecf480a55a63b9fc3a8466e8c88d12890d2fe34ebc44a3ca32f304"} Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.454529 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.454981 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b97bc66cd-p4vv6" Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.460663 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" event={"ID":"a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2","Type":"ContainerStarted","Data":"7e8e16141d6f49241d872481382484ec26f264e70adb86b99c3158ef008bdb1f"} Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.468524 4867 generic.go:334] "Generic (PLEG): container finished" podID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerID="2da9838456e320ab2d731917161fedca6a8b98ca8c9cb79901fd72110515d5bf" exitCode=0 Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.468634 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7df757454-fcwhf" 
event={"ID":"35ebb9d5-af37-425c-b29b-c4f98eab213a","Type":"ContainerDied","Data":"2da9838456e320ab2d731917161fedca6a8b98ca8c9cb79901fd72110515d5bf"} Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.474884 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65fbb9cf75-989xz" podStartSLOduration=6.881111509 podStartE2EDuration="12.474850138s" podCreationTimestamp="2025-12-01 09:29:15 +0000 UTC" firstStartedPulling="2025-12-01 09:29:18.893323642 +0000 UTC m=+1280.352710396" lastFinishedPulling="2025-12-01 09:29:24.487062271 +0000 UTC m=+1285.946449025" observedRunningTime="2025-12-01 09:29:27.472314528 +0000 UTC m=+1288.931701282" watchObservedRunningTime="2025-12-01 09:29:27.474850138 +0000 UTC m=+1288.934236892" Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.511226 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b97bc66cd-p4vv6" podStartSLOduration=3.511198942 podStartE2EDuration="3.511198942s" podCreationTimestamp="2025-12-01 09:29:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:27.494269698 +0000 UTC m=+1288.953656452" watchObservedRunningTime="2025-12-01 09:29:27.511198942 +0000 UTC m=+1288.970585696" Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.517439 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerName="cinder-api-log" containerID="cri-o://b9739e26604ff8379e6fc45511585a975efe1449e0e11a77b884989dfec9beff" gracePeriod=30 Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.517749 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9","Type":"ContainerStarted","Data":"69284b2ffad162d592ee84c6d54dc5e9766834e513c75ad34226ea8165ba895c"} Dec 
01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.517800 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.517860 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerName="cinder-api" containerID="cri-o://69284b2ffad162d592ee84c6d54dc5e9766834e513c75ad34226ea8165ba895c" gracePeriod=30 Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.525879 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-f9f6fdc98-l7cht" podStartSLOduration=7.127402491 podStartE2EDuration="12.525856384s" podCreationTimestamp="2025-12-01 09:29:15 +0000 UTC" firstStartedPulling="2025-12-01 09:29:19.086427238 +0000 UTC m=+1280.545813992" lastFinishedPulling="2025-12-01 09:29:24.484881121 +0000 UTC m=+1285.944267885" observedRunningTime="2025-12-01 09:29:27.522884402 +0000 UTC m=+1288.982271176" watchObservedRunningTime="2025-12-01 09:29:27.525856384 +0000 UTC m=+1288.985243148" Dec 01 09:29:27 crc kubenswrapper[4867]: I1201 09:29:27.594255 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.594233525 podStartE2EDuration="8.594233525s" podCreationTimestamp="2025-12-01 09:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:27.56773617 +0000 UTC m=+1289.027122924" watchObservedRunningTime="2025-12-01 09:29:27.594233525 +0000 UTC m=+1289.053620279" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.015021 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.075330 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mctk\" (UniqueName: \"kubernetes.io/projected/35ebb9d5-af37-425c-b29b-c4f98eab213a-kube-api-access-5mctk\") pod \"35ebb9d5-af37-425c-b29b-c4f98eab213a\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.075391 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-config\") pod \"35ebb9d5-af37-425c-b29b-c4f98eab213a\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.075478 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-ovndb-tls-certs\") pod \"35ebb9d5-af37-425c-b29b-c4f98eab213a\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.075512 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-httpd-config\") pod \"35ebb9d5-af37-425c-b29b-c4f98eab213a\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.075606 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-combined-ca-bundle\") pod \"35ebb9d5-af37-425c-b29b-c4f98eab213a\" (UID: \"35ebb9d5-af37-425c-b29b-c4f98eab213a\") " Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.107040 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/35ebb9d5-af37-425c-b29b-c4f98eab213a-kube-api-access-5mctk" (OuterVolumeSpecName: "kube-api-access-5mctk") pod "35ebb9d5-af37-425c-b29b-c4f98eab213a" (UID: "35ebb9d5-af37-425c-b29b-c4f98eab213a"). InnerVolumeSpecName "kube-api-access-5mctk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.180645 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mctk\" (UniqueName: \"kubernetes.io/projected/35ebb9d5-af37-425c-b29b-c4f98eab213a-kube-api-access-5mctk\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.187656 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "35ebb9d5-af37-425c-b29b-c4f98eab213a" (UID: "35ebb9d5-af37-425c-b29b-c4f98eab213a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.285964 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.470636 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68bfcdf768-4dtj7" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.474277 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68bfcdf768-4dtj7" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.614022 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7df757454-fcwhf" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.614050 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7df757454-fcwhf" event={"ID":"35ebb9d5-af37-425c-b29b-c4f98eab213a","Type":"ContainerDied","Data":"5c18094bccbcd00687eb027ec9281e601bb186bcb0eaf33db45550a960607c01"} Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.616600 4867 scope.go:117] "RemoveContainer" containerID="9bce89257fead3a13690ba81e640c75d7fce7e49bb50a9dd1dbc7cc2e4c7b7eb" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.649973 4867 generic.go:334] "Generic (PLEG): container finished" podID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerID="69284b2ffad162d592ee84c6d54dc5e9766834e513c75ad34226ea8165ba895c" exitCode=0 Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.650010 4867 generic.go:334] "Generic (PLEG): container finished" podID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerID="b9739e26604ff8379e6fc45511585a975efe1449e0e11a77b884989dfec9beff" exitCode=143 Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.650055 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9","Type":"ContainerDied","Data":"69284b2ffad162d592ee84c6d54dc5e9766834e513c75ad34226ea8165ba895c"} Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.650080 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9","Type":"ContainerDied","Data":"b9739e26604ff8379e6fc45511585a975efe1449e0e11a77b884989dfec9beff"} Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.674359 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35ebb9d5-af37-425c-b29b-c4f98eab213a" (UID: 
"35ebb9d5-af37-425c-b29b-c4f98eab213a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.700044 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"031c6adf-727b-441f-b977-6feacc9a2c31","Type":"ContainerStarted","Data":"458329e1da247da42e33f9a12e9a03ca8f903900cf7d44d2005d065cfb4813ce"} Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.725595 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.744933 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b97bc66cd-p4vv6" event={"ID":"f948002f-f1df-40b5-8fcc-db28284c2609","Type":"ContainerStarted","Data":"edffb40494460b18510e686aa60d03c426771dc19419e624552942f342e25689"} Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.757647 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.066133302 podStartE2EDuration="9.757629371s" podCreationTimestamp="2025-12-01 09:29:19 +0000 UTC" firstStartedPulling="2025-12-01 09:29:20.791554373 +0000 UTC m=+1282.250941127" lastFinishedPulling="2025-12-01 09:29:24.483050442 +0000 UTC m=+1285.942437196" observedRunningTime="2025-12-01 09:29:28.744908963 +0000 UTC m=+1290.204295717" watchObservedRunningTime="2025-12-01 09:29:28.757629371 +0000 UTC m=+1290.217016125" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.786397 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-config" (OuterVolumeSpecName: "config") pod "35ebb9d5-af37-425c-b29b-c4f98eab213a" (UID: "35ebb9d5-af37-425c-b29b-c4f98eab213a"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:28 crc kubenswrapper[4867]: I1201 09:29:28.844182 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.020960 4867 scope.go:117] "RemoveContainer" containerID="2da9838456e320ab2d731917161fedca6a8b98ca8c9cb79901fd72110515d5bf" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.031499 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "35ebb9d5-af37-425c-b29b-c4f98eab213a" (UID: "35ebb9d5-af37-425c-b29b-c4f98eab213a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.066047 4867 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35ebb9d5-af37-425c-b29b-c4f98eab213a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.167457 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.275687 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data\") pod \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.275762 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-scripts\") pod \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.275904 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-logs\") pod \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.275932 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkckk\" (UniqueName: \"kubernetes.io/projected/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-kube-api-access-bkckk\") pod \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.276027 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-etc-machine-id\") pod \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.276053 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data-custom\") pod \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.276191 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-combined-ca-bundle\") pod \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\" (UID: \"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9\") " Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.284317 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-logs" (OuterVolumeSpecName: "logs") pod "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" (UID: "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.286516 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" (UID: "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.298507 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" (UID: "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.300888 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7df757454-fcwhf"] Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.305606 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-scripts" (OuterVolumeSpecName: "scripts") pod "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" (UID: "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.312282 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7df757454-fcwhf"] Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.328642 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-kube-api-access-bkckk" (OuterVolumeSpecName: "kube-api-access-bkckk") pod "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" (UID: "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9"). InnerVolumeSpecName "kube-api-access-bkckk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.336485 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" (UID: "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.383000 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.383037 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.383046 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.383055 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkckk\" (UniqueName: \"kubernetes.io/projected/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-kube-api-access-bkckk\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.383065 4867 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.383072 4867 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.506365 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data" (OuterVolumeSpecName: "config-data") pod "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" (UID: "aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.588263 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.589266 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.761222 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerStarted","Data":"955ae7214111cfad62bad5b8d44104f863a69864fcd2ddd389e921e3b8c7e03f"} Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.776170 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.777719 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9","Type":"ContainerDied","Data":"58467cfa663b2f0ec2e3e63e50a424c248ed289cd0ed8958bd261c634726c966"} Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.798221 4867 scope.go:117] "RemoveContainer" containerID="69284b2ffad162d592ee84c6d54dc5e9766834e513c75ad34226ea8165ba895c" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.851589 4867 scope.go:117] "RemoveContainer" containerID="b9739e26604ff8379e6fc45511585a975efe1449e0e11a77b884989dfec9beff" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.866696 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.886950 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 
09:29:29.988063 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:29:29 crc kubenswrapper[4867]: E1201 09:29:29.988878 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerName="cinder-api" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.988914 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerName="cinder-api" Dec 01 09:29:29 crc kubenswrapper[4867]: E1201 09:29:29.988937 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerName="neutron-api" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.988944 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerName="neutron-api" Dec 01 09:29:29 crc kubenswrapper[4867]: E1201 09:29:29.988956 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerName="cinder-api-log" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.988987 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerName="cinder-api-log" Dec 01 09:29:29 crc kubenswrapper[4867]: E1201 09:29:29.989008 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerName="neutron-httpd" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.989015 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerName="neutron-httpd" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.989303 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerName="neutron-api" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.989330 4867 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerName="cinder-api" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.989342 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" containerName="cinder-api-log" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.989384 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ebb9d5-af37-425c-b29b-c4f98eab213a" containerName="neutron-httpd" Dec 01 09:29:29 crc kubenswrapper[4867]: I1201 09:29:29.990846 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.023181 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.023862 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.023986 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.027714 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.115948 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.116036 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wq8\" (UniqueName: \"kubernetes.io/projected/c71e5b77-e090-4fdd-a254-387c5f9c5fba-kube-api-access-v9wq8\") pod \"cinder-api-0\" (UID: 
\"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.116065 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-config-data\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.116140 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.116171 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c71e5b77-e090-4fdd-a254-387c5f9c5fba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.116205 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.116229 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-config-data-custom\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: 
I1201 09:29:30.116248 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71e5b77-e090-4fdd-a254-387c5f9c5fba-logs\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.116277 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-scripts\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.218135 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.218183 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wq8\" (UniqueName: \"kubernetes.io/projected/c71e5b77-e090-4fdd-a254-387c5f9c5fba-kube-api-access-v9wq8\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.218206 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-config-data\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.218251 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.218274 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c71e5b77-e090-4fdd-a254-387c5f9c5fba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.218296 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.218311 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-config-data-custom\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.218325 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71e5b77-e090-4fdd-a254-387c5f9c5fba-logs\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.218343 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-scripts\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 
09:29:30.222146 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c71e5b77-e090-4fdd-a254-387c5f9c5fba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.224884 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71e5b77-e090-4fdd-a254-387c5f9c5fba-logs\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.225121 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.228028 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.228318 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-scripts\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0" Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.229315 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " 
pod="openstack/cinder-api-0" 
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.229419 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-config-data-custom\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0"
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.236514 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71e5b77-e090-4fdd-a254-387c5f9c5fba-config-data\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0"
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.238437 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wq8\" (UniqueName: \"kubernetes.io/projected/c71e5b77-e090-4fdd-a254-387c5f9c5fba-kube-api-access-v9wq8\") pod \"cinder-api-0\" (UID: \"c71e5b77-e090-4fdd-a254-387c5f9c5fba\") " pod="openstack/cinder-api-0"
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.311861 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.567699 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-58d9c4988b-2fdgd" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.568543 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-58d9c4988b-2fdgd" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.852200 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ebb9d5-af37-425c-b29b-c4f98eab213a" path="/var/lib/kubelet/pods/35ebb9d5-af37-425c-b29b-c4f98eab213a/volumes"
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.858731 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9" path="/var/lib/kubelet/pods/aeb1ca71-a772-47b0-8ef8-7355cb2d2fc9/volumes"
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.876954 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerStarted","Data":"e77cb88d4067f405f4fafdeb06849446216517878074e66a98bd78d9f3b471ba"}
Dec 01 09:29:30 crc kubenswrapper[4867]: I1201 09:29:30.991224 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 01 09:29:31 crc kubenswrapper[4867]: I1201 09:29:31.575036 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58d9c4988b-2fdgd" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:29:31 crc kubenswrapper[4867]: I1201 09:29:31.575043 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58d9c4988b-2fdgd" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:29:31 crc kubenswrapper[4867]: I1201 09:29:31.896876 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c71e5b77-e090-4fdd-a254-387c5f9c5fba","Type":"ContainerStarted","Data":"fc83c86419e64223dc3e373b0dc94fd805326327e8347573fda57b446db0108b"}
Dec 01 09:29:31 crc kubenswrapper[4867]: I1201 09:29:31.896919 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c71e5b77-e090-4fdd-a254-387c5f9c5fba","Type":"ContainerStarted","Data":"7e16ceda92f2070256663a89ca697361ad7dbbf27f1c0660099c85d95aa721f7"}
Dec 01 09:29:31 crc kubenswrapper[4867]: I1201 09:29:31.920058 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerStarted","Data":"b2561a8a1932da1b4d94559b1824ad5e60ff376b1697831aa0d6039cb12737f0"}
Dec 01 09:29:32 crc kubenswrapper[4867]: I1201 09:29:32.936008 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerStarted","Data":"d53a66d16cfe9b2aaf0a818646fdad9bf43c42fb7929ee3c443e40dde294ee35"}
Dec 01 09:29:32 crc kubenswrapper[4867]: I1201 09:29:32.937298 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 01 09:29:32 crc kubenswrapper[4867]: I1201 09:29:32.942466 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c71e5b77-e090-4fdd-a254-387c5f9c5fba","Type":"ContainerStarted","Data":"8f38555df3b7b872643df9cda6c5862790e27c05edd05d80d7359c0293f945c7"}
Dec 01 09:29:32 crc kubenswrapper[4867]: I1201 09:29:32.942569 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 01 09:29:33 crc kubenswrapper[4867]: I1201 09:29:33.042595 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.631089185 podStartE2EDuration="8.042572935s" podCreationTimestamp="2025-12-01 09:29:25 +0000 UTC" firstStartedPulling="2025-12-01 09:29:27.109510607 +0000 UTC m=+1288.568897361" lastFinishedPulling="2025-12-01 09:29:32.520994357 +0000 UTC m=+1293.980381111" observedRunningTime="2025-12-01 09:29:32.960626962 +0000 UTC m=+1294.420013716" watchObservedRunningTime="2025-12-01 09:29:33.042572935 +0000 UTC m=+1294.501959689"
Dec 01 09:29:33 crc kubenswrapper[4867]: I1201 09:29:33.055625 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.055598611 podStartE2EDuration="4.055598611s" podCreationTimestamp="2025-12-01 09:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:32.985427271 +0000 UTC m=+1294.444814035" watchObservedRunningTime="2025-12-01 09:29:33.055598611 +0000 UTC m=+1294.514985365"
Dec 01 09:29:34 crc kubenswrapper[4867]: I1201 09:29:34.312398 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58d9c4988b-2fdgd"
Dec 01 09:29:34 crc kubenswrapper[4867]: I1201 09:29:34.977008 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf"
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.072337 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-s7kjh"]
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.072556 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" podUID="3e8123ef-03c8-4e49-b631-d4d90f54c8d0" containerName="dnsmasq-dns" containerID="cri-o://972971977dedf38a39bb58786ff5bf497a0a17d55c293f2e13921de6020e1309" gracePeriod=10
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.221285 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.317283 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.980362 4867 generic.go:334] "Generic (PLEG): container finished" podID="3e8123ef-03c8-4e49-b631-d4d90f54c8d0" containerID="972971977dedf38a39bb58786ff5bf497a0a17d55c293f2e13921de6020e1309" exitCode=0
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.980968 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="031c6adf-727b-441f-b977-6feacc9a2c31" containerName="cinder-scheduler" containerID="cri-o://aee1b40aa22d0bdc5c7ff47b30cfb5c67fc10f14fe4b04998cc5977244622b65" gracePeriod=30
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.981896 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" event={"ID":"3e8123ef-03c8-4e49-b631-d4d90f54c8d0","Type":"ContainerDied","Data":"972971977dedf38a39bb58786ff5bf497a0a17d55c293f2e13921de6020e1309"}
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.981952 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh" event={"ID":"3e8123ef-03c8-4e49-b631-d4d90f54c8d0","Type":"ContainerDied","Data":"86768a8b72610851b6e26224864546d5e4b377e983ed4685cbea74a8648db0b1"}
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.981964 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86768a8b72610851b6e26224864546d5e4b377e983ed4685cbea74a8648db0b1"
Dec 01 09:29:35 crc kubenswrapper[4867]: I1201 09:29:35.982350 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="031c6adf-727b-441f-b977-6feacc9a2c31" containerName="probe" containerID="cri-o://458329e1da247da42e33f9a12e9a03ca8f903900cf7d44d2005d065cfb4813ce" gracePeriod=30
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.036230 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh"
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.160663 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-swift-storage-0\") pod \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") "
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.160888 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-config\") pod \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") "
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.160955 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-nb\") pod \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") "
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.161012 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-svc\") pod \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") "
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.161032 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mzrl\" (UniqueName: \"kubernetes.io/projected/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-kube-api-access-6mzrl\") pod \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") "
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.161103 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-sb\") pod \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\" (UID: \"3e8123ef-03c8-4e49-b631-d4d90f54c8d0\") "
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.189607 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-kube-api-access-6mzrl" (OuterVolumeSpecName: "kube-api-access-6mzrl") pod "3e8123ef-03c8-4e49-b631-d4d90f54c8d0" (UID: "3e8123ef-03c8-4e49-b631-d4d90f54c8d0"). InnerVolumeSpecName "kube-api-access-6mzrl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.262718 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mzrl\" (UniqueName: \"kubernetes.io/projected/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-kube-api-access-6mzrl\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.288622 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3e8123ef-03c8-4e49-b631-d4d90f54c8d0" (UID: "3e8123ef-03c8-4e49-b631-d4d90f54c8d0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.288680 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e8123ef-03c8-4e49-b631-d4d90f54c8d0" (UID: "3e8123ef-03c8-4e49-b631-d4d90f54c8d0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.308288 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-config" (OuterVolumeSpecName: "config") pod "3e8123ef-03c8-4e49-b631-d4d90f54c8d0" (UID: "3e8123ef-03c8-4e49-b631-d4d90f54c8d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.363931 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-config\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.363966 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.363975 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.370403 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3e8123ef-03c8-4e49-b631-d4d90f54c8d0" (UID: "3e8123ef-03c8-4e49-b631-d4d90f54c8d0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.380754 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3e8123ef-03c8-4e49-b631-d4d90f54c8d0" (UID: "3e8123ef-03c8-4e49-b631-d4d90f54c8d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.465179 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.465210 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e8123ef-03c8-4e49-b631-d4d90f54c8d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.617104 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58d9c4988b-2fdgd" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:29:36 crc kubenswrapper[4867]: I1201 09:29:36.621571 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58d9c4988b-2fdgd"
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.006709 4867 generic.go:334] "Generic (PLEG): container finished" podID="031c6adf-727b-441f-b977-6feacc9a2c31" containerID="458329e1da247da42e33f9a12e9a03ca8f903900cf7d44d2005d065cfb4813ce" exitCode=0
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.006994 4867 generic.go:334] "Generic (PLEG): container finished" podID="031c6adf-727b-441f-b977-6feacc9a2c31" containerID="aee1b40aa22d0bdc5c7ff47b30cfb5c67fc10f14fe4b04998cc5977244622b65" exitCode=0
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.007067 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-s7kjh"
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.007753 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"031c6adf-727b-441f-b977-6feacc9a2c31","Type":"ContainerDied","Data":"458329e1da247da42e33f9a12e9a03ca8f903900cf7d44d2005d065cfb4813ce"}
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.007780 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"031c6adf-727b-441f-b977-6feacc9a2c31","Type":"ContainerDied","Data":"aee1b40aa22d0bdc5c7ff47b30cfb5c67fc10f14fe4b04998cc5977244622b65"}
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.042289 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-s7kjh"]
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.054987 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-s7kjh"]
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.645598 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.793565 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4p8x\" (UniqueName: \"kubernetes.io/projected/031c6adf-727b-441f-b977-6feacc9a2c31-kube-api-access-v4p8x\") pod \"031c6adf-727b-441f-b977-6feacc9a2c31\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") "
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.794474 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data\") pod \"031c6adf-727b-441f-b977-6feacc9a2c31\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") "
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.794781 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-combined-ca-bundle\") pod \"031c6adf-727b-441f-b977-6feacc9a2c31\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") "
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.794829 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data-custom\") pod \"031c6adf-727b-441f-b977-6feacc9a2c31\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") "
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.794858 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-scripts\") pod \"031c6adf-727b-441f-b977-6feacc9a2c31\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") "
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.794877 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/031c6adf-727b-441f-b977-6feacc9a2c31-etc-machine-id\") pod \"031c6adf-727b-441f-b977-6feacc9a2c31\" (UID: \"031c6adf-727b-441f-b977-6feacc9a2c31\") "
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.795158 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/031c6adf-727b-441f-b977-6feacc9a2c31-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "031c6adf-727b-441f-b977-6feacc9a2c31" (UID: "031c6adf-727b-441f-b977-6feacc9a2c31"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.795427 4867 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/031c6adf-727b-441f-b977-6feacc9a2c31-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.804049 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "031c6adf-727b-441f-b977-6feacc9a2c31" (UID: "031c6adf-727b-441f-b977-6feacc9a2c31"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.822083 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031c6adf-727b-441f-b977-6feacc9a2c31-kube-api-access-v4p8x" (OuterVolumeSpecName: "kube-api-access-v4p8x") pod "031c6adf-727b-441f-b977-6feacc9a2c31" (UID: "031c6adf-727b-441f-b977-6feacc9a2c31"). InnerVolumeSpecName "kube-api-access-v4p8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.827998 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-scripts" (OuterVolumeSpecName: "scripts") pod "031c6adf-727b-441f-b977-6feacc9a2c31" (UID: "031c6adf-727b-441f-b977-6feacc9a2c31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.879989 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "031c6adf-727b-441f-b977-6feacc9a2c31" (UID: "031c6adf-727b-441f-b977-6feacc9a2c31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.898329 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.899304 4867 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.899416 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.899475 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4p8x\" (UniqueName: \"kubernetes.io/projected/031c6adf-727b-441f-b977-6feacc9a2c31-kube-api-access-v4p8x\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:37 crc kubenswrapper[4867]: I1201 09:29:37.967984 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data" (OuterVolumeSpecName: "config-data") pod "031c6adf-727b-441f-b977-6feacc9a2c31" (UID: "031c6adf-727b-441f-b977-6feacc9a2c31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.001382 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031c6adf-727b-441f-b977-6feacc9a2c31-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.016038 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"031c6adf-727b-441f-b977-6feacc9a2c31","Type":"ContainerDied","Data":"a40c91a76ecd87eb65f16a343b8bad5d889b0184197ceb233d7fd79c21072ec4"}
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.016443 4867 scope.go:117] "RemoveContainer" containerID="458329e1da247da42e33f9a12e9a03ca8f903900cf7d44d2005d065cfb4813ce"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.016662 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.086942 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.105637 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.106560 4867 scope.go:117] "RemoveContainer" containerID="aee1b40aa22d0bdc5c7ff47b30cfb5c67fc10f14fe4b04998cc5977244622b65"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.156134 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 09:29:38 crc kubenswrapper[4867]: E1201 09:29:38.156578 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8123ef-03c8-4e49-b631-d4d90f54c8d0" containerName="dnsmasq-dns"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.156591 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8123ef-03c8-4e49-b631-d4d90f54c8d0" containerName="dnsmasq-dns"
Dec 01 09:29:38 crc kubenswrapper[4867]: E1201 09:29:38.156616 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031c6adf-727b-441f-b977-6feacc9a2c31" containerName="cinder-scheduler"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.156623 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="031c6adf-727b-441f-b977-6feacc9a2c31" containerName="cinder-scheduler"
Dec 01 09:29:38 crc kubenswrapper[4867]: E1201 09:29:38.156634 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8123ef-03c8-4e49-b631-d4d90f54c8d0" containerName="init"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.156640 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8123ef-03c8-4e49-b631-d4d90f54c8d0" containerName="init"
Dec 01 09:29:38 crc kubenswrapper[4867]: E1201 09:29:38.156656 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031c6adf-727b-441f-b977-6feacc9a2c31" containerName="probe"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.156661 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="031c6adf-727b-441f-b977-6feacc9a2c31" containerName="probe"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.156836 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8123ef-03c8-4e49-b631-d4d90f54c8d0" containerName="dnsmasq-dns"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.156847 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="031c6adf-727b-441f-b977-6feacc9a2c31" containerName="cinder-scheduler"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.156857 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="031c6adf-727b-441f-b977-6feacc9a2c31" containerName="probe"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.157791 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.161782 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.205152 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.213836 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qzxv\" (UniqueName: \"kubernetes.io/projected/9fe6c397-9427-4440-9d14-b0397c62f8ea-kube-api-access-8qzxv\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.213900 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.213952 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.213977 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fe6c397-9427-4440-9d14-b0397c62f8ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.214030 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.214097 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.314977 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.315041 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.315112 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qzxv\" (UniqueName: \"kubernetes.io/projected/9fe6c397-9427-4440-9d14-b0397c62f8ea-kube-api-access-8qzxv\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.315136 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.315166 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.315183 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fe6c397-9427-4440-9d14-b0397c62f8ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.315271 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fe6c397-9427-4440-9d14-b0397c62f8ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.320640 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.322178 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.322681 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.323268 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fe6c397-9427-4440-9d14-b0397c62f8ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.339494 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qzxv\" (UniqueName: \"kubernetes.io/projected/9fe6c397-9427-4440-9d14-b0397c62f8ea-kube-api-access-8qzxv\") pod \"cinder-scheduler-0\" (UID: \"9fe6c397-9427-4440-9d14-b0397c62f8ea\") " pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.503759 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.859568 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031c6adf-727b-441f-b977-6feacc9a2c31" path="/var/lib/kubelet/pods/031c6adf-727b-441f-b977-6feacc9a2c31/volumes"
Dec 01 09:29:38 crc kubenswrapper[4867]: I1201 09:29:38.860474 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e8123ef-03c8-4e49-b631-d4d90f54c8d0" path="/var/lib/kubelet/pods/3e8123ef-03c8-4e49-b631-d4d90f54c8d0/volumes"
Dec 01 09:29:39 crc kubenswrapper[4867]: I1201 09:29:39.184646 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 09:29:39 crc kubenswrapper[4867]: I1201 09:29:39.543297 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7b56ffdd7f-kp95s"
Dec 01 09:29:39 crc kubenswrapper[4867]: I1201 09:29:39.575172 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b97bc66cd-p4vv6"
Dec 01 09:29:39 crc kubenswrapper[4867]: I1201 09:29:39.853136 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b97bc66cd-p4vv6"
Dec 01 09:29:39 crc kubenswrapper[4867]: I1201 09:29:39.940808 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58d9c4988b-2fdgd"]
Dec 01 09:29:39 crc kubenswrapper[4867]: I1201 09:29:39.941102 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58d9c4988b-2fdgd" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api-log" containerID="cri-o://3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be" gracePeriod=30
Dec 01 09:29:39 crc kubenswrapper[4867]: I1201 09:29:39.941594 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58d9c4988b-2fdgd" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api" containerID="cri-o://88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c" gracePeriod=30
Dec 01 09:29:40 crc kubenswrapper[4867]: I1201 09:29:40.062262 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fe6c397-9427-4440-9d14-b0397c62f8ea","Type":"ContainerStarted","Data":"e3c5fa3e93993e93e88828af8b549ecdd810e9854c6e5ffa154d6b0ad2868f65"}
Dec 01 09:29:41 crc kubenswrapper[4867]: I1201 09:29:41.082828 4867 generic.go:334] "Generic (PLEG): container finished" podID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerID="3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be" exitCode=143
Dec 01 09:29:41 crc kubenswrapper[4867]: I1201 09:29:41.083240 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d9c4988b-2fdgd" event={"ID":"f42bd489-b6c5-4f24-8da8-2b01860a71d2","Type":"ContainerDied","Data":"3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be"}
Dec 01 09:29:41 crc kubenswrapper[4867]: I1201 09:29:41.087489 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fe6c397-9427-4440-9d14-b0397c62f8ea","Type":"ContainerStarted","Data":"dbe2cc22afe5bc82d264ba698c548b34154c3f03c2fe318b1c740c8a10629914"}
Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.136017 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9fe6c397-9427-4440-9d14-b0397c62f8ea","Type":"ContainerStarted","Data":"fa88890a687c3dc644ff514ec5b98b55bf5adfdcc16c695a811d4fe4709e031b"}
Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.176943 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=4.176923573 podStartE2EDuration="4.176923573s" podCreationTimestamp="2025-12-01 09:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:29:42.176373227 +0000 UTC m=+1303.635759981" watchObservedRunningTime="2025-12-01 09:29:42.176923573 +0000 UTC m=+1303.636310327" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.297907 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.299375 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.304288 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-ghxzl" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.305694 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.327459 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.385858 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.401066 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p245h\" (UniqueName: \"kubernetes.io/projected/e2fbdcd3-0c11-4681-99af-c9b4fb717637-kube-api-access-p245h\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.401165 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e2fbdcd3-0c11-4681-99af-c9b4fb717637-openstack-config-secret\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.401246 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e2fbdcd3-0c11-4681-99af-c9b4fb717637-openstack-config\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.401418 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fbdcd3-0c11-4681-99af-c9b4fb717637-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.502688 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fbdcd3-0c11-4681-99af-c9b4fb717637-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.502758 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p245h\" (UniqueName: \"kubernetes.io/projected/e2fbdcd3-0c11-4681-99af-c9b4fb717637-kube-api-access-p245h\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.502794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/e2fbdcd3-0c11-4681-99af-c9b4fb717637-openstack-config-secret\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.502848 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e2fbdcd3-0c11-4681-99af-c9b4fb717637-openstack-config\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.503829 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e2fbdcd3-0c11-4681-99af-c9b4fb717637-openstack-config\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.509751 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fbdcd3-0c11-4681-99af-c9b4fb717637-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.511295 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e2fbdcd3-0c11-4681-99af-c9b4fb717637-openstack-config-secret\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.533933 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p245h\" (UniqueName: \"kubernetes.io/projected/e2fbdcd3-0c11-4681-99af-c9b4fb717637-kube-api-access-p245h\") pod \"openstackclient\" (UID: \"e2fbdcd3-0c11-4681-99af-c9b4fb717637\") " 
pod="openstack/openstackclient" Dec 01 09:29:42 crc kubenswrapper[4867]: I1201 09:29:42.628308 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 09:29:43 crc kubenswrapper[4867]: I1201 09:29:43.298670 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58d9c4988b-2fdgd" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:60808->10.217.0.158:9311: read: connection reset by peer" Dec 01 09:29:43 crc kubenswrapper[4867]: I1201 09:29:43.299320 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58d9c4988b-2fdgd" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:60810->10.217.0.158:9311: read: connection reset by peer" Dec 01 09:29:43 crc kubenswrapper[4867]: I1201 09:29:43.504656 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 09:29:44 crc kubenswrapper[4867]: I1201 09:29:44.310288 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 09:29:44 crc kubenswrapper[4867]: I1201 09:29:44.320025 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="c71e5b77-e090-4fdd-a254-387c5f9c5fba" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.164:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:29:44 crc kubenswrapper[4867]: I1201 09:29:44.997044 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.174784 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-combined-ca-bundle\") pod \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.175593 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f42bd489-b6c5-4f24-8da8-2b01860a71d2-logs\") pod \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.175719 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data-custom\") pod \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.175907 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmcl8\" (UniqueName: \"kubernetes.io/projected/f42bd489-b6c5-4f24-8da8-2b01860a71d2-kube-api-access-dmcl8\") pod \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.175986 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data\") pod \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\" (UID: \"f42bd489-b6c5-4f24-8da8-2b01860a71d2\") " Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.176729 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f42bd489-b6c5-4f24-8da8-2b01860a71d2-logs" (OuterVolumeSpecName: "logs") pod "f42bd489-b6c5-4f24-8da8-2b01860a71d2" (UID: "f42bd489-b6c5-4f24-8da8-2b01860a71d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.184922 4867 generic.go:334] "Generic (PLEG): container finished" podID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerID="88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c" exitCode=0 Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.184977 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d9c4988b-2fdgd" event={"ID":"f42bd489-b6c5-4f24-8da8-2b01860a71d2","Type":"ContainerDied","Data":"88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c"} Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.185016 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58d9c4988b-2fdgd" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.185043 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d9c4988b-2fdgd" event={"ID":"f42bd489-b6c5-4f24-8da8-2b01860a71d2","Type":"ContainerDied","Data":"5c6315d1c4d92454d541c51b1f6ffce5eb2a5698612bce3c8c6adc1f134904c9"} Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.185082 4867 scope.go:117] "RemoveContainer" containerID="88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.188794 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42bd489-b6c5-4f24-8da8-2b01860a71d2-kube-api-access-dmcl8" (OuterVolumeSpecName: "kube-api-access-dmcl8") pod "f42bd489-b6c5-4f24-8da8-2b01860a71d2" (UID: "f42bd489-b6c5-4f24-8da8-2b01860a71d2"). InnerVolumeSpecName "kube-api-access-dmcl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.188937 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e2fbdcd3-0c11-4681-99af-c9b4fb717637","Type":"ContainerStarted","Data":"bc6e2bcc3fa13ae5f7485b998d6bb0edf824436f22ad218363f6d8575b65daed"} Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.202695 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f42bd489-b6c5-4f24-8da8-2b01860a71d2" (UID: "f42bd489-b6c5-4f24-8da8-2b01860a71d2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.231274 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f42bd489-b6c5-4f24-8da8-2b01860a71d2" (UID: "f42bd489-b6c5-4f24-8da8-2b01860a71d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.273802 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data" (OuterVolumeSpecName: "config-data") pod "f42bd489-b6c5-4f24-8da8-2b01860a71d2" (UID: "f42bd489-b6c5-4f24-8da8-2b01860a71d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.278101 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f42bd489-b6c5-4f24-8da8-2b01860a71d2-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.278132 4867 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.278143 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmcl8\" (UniqueName: \"kubernetes.io/projected/f42bd489-b6c5-4f24-8da8-2b01860a71d2-kube-api-access-dmcl8\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.278152 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.278160 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f42bd489-b6c5-4f24-8da8-2b01860a71d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.307353 4867 scope.go:117] "RemoveContainer" containerID="3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.317962 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="c71e5b77-e090-4fdd-a254-387c5f9c5fba" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.164:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 
09:29:45.356754 4867 scope.go:117] "RemoveContainer" containerID="88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c" Dec 01 09:29:45 crc kubenswrapper[4867]: E1201 09:29:45.357264 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c\": container with ID starting with 88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c not found: ID does not exist" containerID="88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.357303 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c"} err="failed to get container status \"88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c\": rpc error: code = NotFound desc = could not find container \"88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c\": container with ID starting with 88f1c5bc5c9ed33faa44dd06325335269e8afa668886978d4cb67a4510f9a59c not found: ID does not exist" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.357330 4867 scope.go:117] "RemoveContainer" containerID="3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be" Dec 01 09:29:45 crc kubenswrapper[4867]: E1201 09:29:45.357628 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be\": container with ID starting with 3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be not found: ID does not exist" containerID="3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.357709 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be"} err="failed to get container status \"3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be\": rpc error: code = NotFound desc = could not find container \"3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be\": container with ID starting with 3ed3f3b529ee396bf3fa49e37b14481e623bda8a5e55f93e6fb805af421a53be not found: ID does not exist" Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.527055 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58d9c4988b-2fdgd"] Dec 01 09:29:45 crc kubenswrapper[4867]: I1201 09:29:45.536698 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-58d9c4988b-2fdgd"] Dec 01 09:29:46 crc kubenswrapper[4867]: I1201 09:29:46.841903 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" path="/var/lib/kubelet/pods/f42bd489-b6c5-4f24-8da8-2b01860a71d2/volumes" Dec 01 09:29:48 crc kubenswrapper[4867]: I1201 09:29:48.409369 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 09:29:48 crc kubenswrapper[4867]: I1201 09:29:48.870686 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.163087 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.163432 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="ceilometer-central-agent" containerID="cri-o://955ae7214111cfad62bad5b8d44104f863a69864fcd2ddd389e921e3b8c7e03f" gracePeriod=30 Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.165271 4867 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="sg-core" containerID="cri-o://b2561a8a1932da1b4d94559b1824ad5e60ff376b1697831aa0d6039cb12737f0" gracePeriod=30 Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.165399 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="proxy-httpd" containerID="cri-o://d53a66d16cfe9b2aaf0a818646fdad9bf43c42fb7929ee3c443e40dde294ee35" gracePeriod=30 Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.165481 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="ceilometer-notification-agent" containerID="cri-o://e77cb88d4067f405f4fafdeb06849446216517878074e66a98bd78d9f3b471ba" gracePeriod=30 Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.235357 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.163:3000/\": EOF" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.626709 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-9dcc6b98f-chkvz"] Dec 01 09:29:50 crc kubenswrapper[4867]: E1201 09:29:50.627396 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.627479 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api" Dec 01 09:29:50 crc kubenswrapper[4867]: E1201 09:29:50.627554 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api-log" Dec 01 
09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.627610 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api-log" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.627930 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.628055 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42bd489-b6c5-4f24-8da8-2b01860a71d2" containerName="barbican-api-log" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.629184 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.636655 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.636897 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.637052 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.673030 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9dcc6b98f-chkvz"] Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.783215 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/476caa3a-28ba-471d-b4c0-c263c5960a87-etc-swift\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.783272 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/476caa3a-28ba-471d-b4c0-c263c5960a87-run-httpd\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.783364 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-combined-ca-bundle\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.783406 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-config-data\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.783453 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcxp\" (UniqueName: \"kubernetes.io/projected/476caa3a-28ba-471d-b4c0-c263c5960a87-kube-api-access-jrcxp\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.783490 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-internal-tls-certs\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.783542 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/476caa3a-28ba-471d-b4c0-c263c5960a87-log-httpd\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.783651 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-public-tls-certs\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.885235 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-combined-ca-bundle\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.885317 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-config-data\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.885363 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcxp\" (UniqueName: \"kubernetes.io/projected/476caa3a-28ba-471d-b4c0-c263c5960a87-kube-api-access-jrcxp\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.885392 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-internal-tls-certs\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.885439 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/476caa3a-28ba-471d-b4c0-c263c5960a87-log-httpd\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.885509 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-public-tls-certs\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.885593 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/476caa3a-28ba-471d-b4c0-c263c5960a87-etc-swift\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.885624 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/476caa3a-28ba-471d-b4c0-c263c5960a87-run-httpd\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.886175 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/476caa3a-28ba-471d-b4c0-c263c5960a87-run-httpd\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.886631 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/476caa3a-28ba-471d-b4c0-c263c5960a87-log-httpd\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.892374 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-internal-tls-certs\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.894714 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-config-data\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.902751 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-public-tls-certs\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.903959 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476caa3a-28ba-471d-b4c0-c263c5960a87-combined-ca-bundle\") pod \"swift-proxy-9dcc6b98f-chkvz\" 
(UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.904350 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/476caa3a-28ba-471d-b4c0-c263c5960a87-etc-swift\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.908691 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcxp\" (UniqueName: \"kubernetes.io/projected/476caa3a-28ba-471d-b4c0-c263c5960a87-kube-api-access-jrcxp\") pod \"swift-proxy-9dcc6b98f-chkvz\" (UID: \"476caa3a-28ba-471d-b4c0-c263c5960a87\") " pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:50 crc kubenswrapper[4867]: I1201 09:29:50.980240 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:29:51 crc kubenswrapper[4867]: I1201 09:29:51.318740 4867 generic.go:334] "Generic (PLEG): container finished" podID="318a4560-9e79-46fe-96bf-aaa534848b45" containerID="d53a66d16cfe9b2aaf0a818646fdad9bf43c42fb7929ee3c443e40dde294ee35" exitCode=0 Dec 01 09:29:51 crc kubenswrapper[4867]: I1201 09:29:51.318780 4867 generic.go:334] "Generic (PLEG): container finished" podID="318a4560-9e79-46fe-96bf-aaa534848b45" containerID="b2561a8a1932da1b4d94559b1824ad5e60ff376b1697831aa0d6039cb12737f0" exitCode=2 Dec 01 09:29:51 crc kubenswrapper[4867]: I1201 09:29:51.318797 4867 generic.go:334] "Generic (PLEG): container finished" podID="318a4560-9e79-46fe-96bf-aaa534848b45" containerID="955ae7214111cfad62bad5b8d44104f863a69864fcd2ddd389e921e3b8c7e03f" exitCode=0 Dec 01 09:29:51 crc kubenswrapper[4867]: I1201 09:29:51.318846 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerDied","Data":"d53a66d16cfe9b2aaf0a818646fdad9bf43c42fb7929ee3c443e40dde294ee35"} Dec 01 09:29:51 crc kubenswrapper[4867]: I1201 09:29:51.318879 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerDied","Data":"b2561a8a1932da1b4d94559b1824ad5e60ff376b1697831aa0d6039cb12737f0"} Dec 01 09:29:51 crc kubenswrapper[4867]: I1201 09:29:51.318894 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerDied","Data":"955ae7214111cfad62bad5b8d44104f863a69864fcd2ddd389e921e3b8c7e03f"} Dec 01 09:29:51 crc kubenswrapper[4867]: I1201 09:29:51.601717 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:29:51 crc kubenswrapper[4867]: I1201 09:29:51.602107 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:29:51 crc kubenswrapper[4867]: W1201 09:29:51.623572 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod476caa3a_28ba_471d_b4c0_c263c5960a87.slice/crio-8180262b285687d4ce8d6c561b714d60deb351ce9a13215cf53c48b3435d1527 WatchSource:0}: Error finding container 8180262b285687d4ce8d6c561b714d60deb351ce9a13215cf53c48b3435d1527: Status 404 returned error can't find the container with id 
8180262b285687d4ce8d6c561b714d60deb351ce9a13215cf53c48b3435d1527 Dec 01 09:29:51 crc kubenswrapper[4867]: I1201 09:29:51.637121 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9dcc6b98f-chkvz"] Dec 01 09:29:52 crc kubenswrapper[4867]: I1201 09:29:52.373757 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9dcc6b98f-chkvz" event={"ID":"476caa3a-28ba-471d-b4c0-c263c5960a87","Type":"ContainerStarted","Data":"8180262b285687d4ce8d6c561b714d60deb351ce9a13215cf53c48b3435d1527"} Dec 01 09:29:52 crc kubenswrapper[4867]: I1201 09:29:52.376591 4867 generic.go:334] "Generic (PLEG): container finished" podID="318a4560-9e79-46fe-96bf-aaa534848b45" containerID="e77cb88d4067f405f4fafdeb06849446216517878074e66a98bd78d9f3b471ba" exitCode=0 Dec 01 09:29:52 crc kubenswrapper[4867]: I1201 09:29:52.376616 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerDied","Data":"e77cb88d4067f405f4fafdeb06849446216517878074e66a98bd78d9f3b471ba"} Dec 01 09:29:53 crc kubenswrapper[4867]: I1201 09:29:53.401185 4867 generic.go:334] "Generic (PLEG): container finished" podID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerID="9d03af7b1362790fa6ac6592121987809cbd79e15e394cba2fd458a1d0946120" exitCode=137 Dec 01 09:29:53 crc kubenswrapper[4867]: I1201 09:29:53.401343 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d47c7cb76-srf4p" event={"ID":"8bd4fac2-df2c-4aab-bf00-99b54a83ddca","Type":"ContainerDied","Data":"9d03af7b1362790fa6ac6592121987809cbd79e15e394cba2fd458a1d0946120"} Dec 01 09:29:53 crc kubenswrapper[4867]: I1201 09:29:53.410014 4867 generic.go:334] "Generic (PLEG): container finished" podID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerID="686c5303f0412b7b582b8c491b3e8223fe86fdd2e4836a2991c0f50fae8a3067" exitCode=137 Dec 01 09:29:53 crc kubenswrapper[4867]: I1201 09:29:53.410067 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerDied","Data":"686c5303f0412b7b582b8c491b3e8223fe86fdd2e4836a2991c0f50fae8a3067"} Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.069714 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.163:3000/\": dial tcp 10.217.0.163:3000: connect: connection refused" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.519521 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-kwrll"] Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.521568 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kwrll" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.606842 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-operator-scripts\") pod \"nova-api-db-create-kwrll\" (UID: \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\") " pod="openstack/nova-api-db-create-kwrll" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.606963 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npr6m\" (UniqueName: \"kubernetes.io/projected/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-kube-api-access-npr6m\") pod \"nova-api-db-create-kwrll\" (UID: \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\") " pod="openstack/nova-api-db-create-kwrll" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.620888 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-kxgsc"] Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.622464 4867 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.639034 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kwrll"] Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.659019 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kxgsc"] Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.724033 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-operator-scripts\") pod \"nova-api-db-create-kwrll\" (UID: \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\") " pod="openstack/nova-api-db-create-kwrll" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.724165 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq8mw\" (UniqueName: \"kubernetes.io/projected/37bc51bf-0822-420e-8d4a-5cb236dd83e4-kube-api-access-lq8mw\") pod \"nova-cell0-db-create-kxgsc\" (UID: \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\") " pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.724230 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npr6m\" (UniqueName: \"kubernetes.io/projected/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-kube-api-access-npr6m\") pod \"nova-api-db-create-kwrll\" (UID: \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\") " pod="openstack/nova-api-db-create-kwrll" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.724522 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37bc51bf-0822-420e-8d4a-5cb236dd83e4-operator-scripts\") pod \"nova-cell0-db-create-kxgsc\" (UID: \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\") " 
pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.725149 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-operator-scripts\") pod \"nova-api-db-create-kwrll\" (UID: \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\") " pod="openstack/nova-api-db-create-kwrll" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.759660 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npr6m\" (UniqueName: \"kubernetes.io/projected/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-kube-api-access-npr6m\") pod \"nova-api-db-create-kwrll\" (UID: \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\") " pod="openstack/nova-api-db-create-kwrll" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.776843 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0fcd-account-create-update-pkxwm"] Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.778192 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.781108 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.826071 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37bc51bf-0822-420e-8d4a-5cb236dd83e4-operator-scripts\") pod \"nova-cell0-db-create-kxgsc\" (UID: \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\") " pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.829860 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37bc51bf-0822-420e-8d4a-5cb236dd83e4-operator-scripts\") pod \"nova-cell0-db-create-kxgsc\" (UID: \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\") " pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.830050 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq8mw\" (UniqueName: \"kubernetes.io/projected/37bc51bf-0822-420e-8d4a-5cb236dd83e4-kube-api-access-lq8mw\") pod \"nova-cell0-db-create-kxgsc\" (UID: \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\") " pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.845769 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kwrll" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.851150 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0fcd-account-create-update-pkxwm"] Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.865887 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-qm4ft"] Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.868297 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.887057 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qm4ft"] Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.933001 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-operator-scripts\") pod \"nova-api-0fcd-account-create-update-pkxwm\" (UID: \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\") " pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:29:56 crc kubenswrapper[4867]: I1201 09:29:56.933222 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n5hl\" (UniqueName: \"kubernetes.io/projected/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-kube-api-access-7n5hl\") pod \"nova-api-0fcd-account-create-update-pkxwm\" (UID: \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\") " pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.035531 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-operator-scripts\") pod \"nova-api-0fcd-account-create-update-pkxwm\" (UID: \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\") " 
pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.035636 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ea5fe34-452d-4805-8bda-47d8f1ab2381-operator-scripts\") pod \"nova-cell1-db-create-qm4ft\" (UID: \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\") " pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.035786 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n5hl\" (UniqueName: \"kubernetes.io/projected/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-kube-api-access-7n5hl\") pod \"nova-api-0fcd-account-create-update-pkxwm\" (UID: \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\") " pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.035876 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ml4g\" (UniqueName: \"kubernetes.io/projected/5ea5fe34-452d-4805-8bda-47d8f1ab2381-kube-api-access-5ml4g\") pod \"nova-cell1-db-create-qm4ft\" (UID: \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\") " pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.036394 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-operator-scripts\") pod \"nova-api-0fcd-account-create-update-pkxwm\" (UID: \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\") " pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.134346 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8d9d-account-create-update-q8sjd"] Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.135962 4867 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.137534 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ea5fe34-452d-4805-8bda-47d8f1ab2381-operator-scripts\") pod \"nova-cell1-db-create-qm4ft\" (UID: \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\") " pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.137748 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ml4g\" (UniqueName: \"kubernetes.io/projected/5ea5fe34-452d-4805-8bda-47d8f1ab2381-kube-api-access-5ml4g\") pod \"nova-cell1-db-create-qm4ft\" (UID: \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\") " pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.138900 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ea5fe34-452d-4805-8bda-47d8f1ab2381-operator-scripts\") pod \"nova-cell1-db-create-qm4ft\" (UID: \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\") " pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.152067 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8d9d-account-create-update-q8sjd"] Dec 01 09:29:57 crc kubenswrapper[4867]: W1201 09:29:57.164998 4867 reflector.go:561] object-"openstack"/"nova-cell1-db-secret": failed to list *v1.Secret: secrets "nova-cell1-db-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 01 09:29:57 crc kubenswrapper[4867]: E1201 09:29:57.165286 4867 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"nova-cell1-db-secret\": Failed to watch *v1.Secret: failed to 
list *v1.Secret: secrets \"nova-cell1-db-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.187640 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ml4g\" (UniqueName: \"kubernetes.io/projected/5ea5fe34-452d-4805-8bda-47d8f1ab2381-kube-api-access-5ml4g\") pod \"nova-cell1-db-create-qm4ft\" (UID: \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\") " pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.189394 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n5hl\" (UniqueName: \"kubernetes.io/projected/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-kube-api-access-7n5hl\") pod \"nova-api-0fcd-account-create-update-pkxwm\" (UID: \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\") " pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.189481 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.194229 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c564-account-create-update-h6vqd"] Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.197456 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq8mw\" (UniqueName: \"kubernetes.io/projected/37bc51bf-0822-420e-8d4a-5cb236dd83e4-kube-api-access-lq8mw\") pod \"nova-cell0-db-create-kxgsc\" (UID: \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\") " pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.198570 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.212431 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.221941 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c564-account-create-update-h6vqd"] Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.243356 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-operator-scripts\") pod \"nova-cell1-8d9d-account-create-update-q8sjd\" (UID: \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\") " pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.243541 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr9fl\" (UniqueName: \"kubernetes.io/projected/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-kube-api-access-gr9fl\") pod \"nova-cell1-8d9d-account-create-update-q8sjd\" (UID: \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\") " pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.246850 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.344979 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w7rr\" (UniqueName: \"kubernetes.io/projected/3e5461b6-70c3-4b0a-aea5-827baa9fc665-kube-api-access-7w7rr\") pod \"nova-cell0-c564-account-create-update-h6vqd\" (UID: \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\") " pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.345148 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr9fl\" (UniqueName: \"kubernetes.io/projected/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-kube-api-access-gr9fl\") pod \"nova-cell1-8d9d-account-create-update-q8sjd\" (UID: \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\") " pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.345415 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e5461b6-70c3-4b0a-aea5-827baa9fc665-operator-scripts\") pod \"nova-cell0-c564-account-create-update-h6vqd\" (UID: \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\") " pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.345495 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-operator-scripts\") pod \"nova-cell1-8d9d-account-create-update-q8sjd\" (UID: \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\") " pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.346358 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-operator-scripts\") pod \"nova-cell1-8d9d-account-create-update-q8sjd\" (UID: \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\") " pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.366713 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr9fl\" (UniqueName: \"kubernetes.io/projected/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-kube-api-access-gr9fl\") pod \"nova-cell1-8d9d-account-create-update-q8sjd\" (UID: \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\") " pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.442863 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.448517 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e5461b6-70c3-4b0a-aea5-827baa9fc665-operator-scripts\") pod \"nova-cell0-c564-account-create-update-h6vqd\" (UID: \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\") " pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.449353 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w7rr\" (UniqueName: \"kubernetes.io/projected/3e5461b6-70c3-4b0a-aea5-827baa9fc665-kube-api-access-7w7rr\") pod \"nova-cell0-c564-account-create-update-h6vqd\" (UID: \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\") " pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.450219 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e5461b6-70c3-4b0a-aea5-827baa9fc665-operator-scripts\") pod 
\"nova-cell0-c564-account-create-update-h6vqd\" (UID: \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\") " pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.463889 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.468475 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w7rr\" (UniqueName: \"kubernetes.io/projected/3e5461b6-70c3-4b0a-aea5-827baa9fc665-kube-api-access-7w7rr\") pod \"nova-cell0-c564-account-create-update-h6vqd\" (UID: \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\") " pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:29:57 crc kubenswrapper[4867]: I1201 09:29:57.615076 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:29:58 crc kubenswrapper[4867]: I1201 09:29:58.000603 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.141511 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z"] Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.143186 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.155976 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.156509 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.171681 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z"] Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.190935 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b895e470-c6f1-4072-b66d-b6c3cb1791e8-config-volume\") pod \"collect-profiles-29409690-5q42z\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.191205 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b895e470-c6f1-4072-b66d-b6c3cb1791e8-secret-volume\") pod \"collect-profiles-29409690-5q42z\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.191452 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgrxp\" (UniqueName: \"kubernetes.io/projected/b895e470-c6f1-4072-b66d-b6c3cb1791e8-kube-api-access-zgrxp\") pod \"collect-profiles-29409690-5q42z\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.292658 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b895e470-c6f1-4072-b66d-b6c3cb1791e8-secret-volume\") pod \"collect-profiles-29409690-5q42z\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.292737 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgrxp\" (UniqueName: \"kubernetes.io/projected/b895e470-c6f1-4072-b66d-b6c3cb1791e8-kube-api-access-zgrxp\") pod \"collect-profiles-29409690-5q42z\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.292874 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b895e470-c6f1-4072-b66d-b6c3cb1791e8-config-volume\") pod \"collect-profiles-29409690-5q42z\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.293677 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b895e470-c6f1-4072-b66d-b6c3cb1791e8-config-volume\") pod \"collect-profiles-29409690-5q42z\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.298166 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b895e470-c6f1-4072-b66d-b6c3cb1791e8-secret-volume\") pod \"collect-profiles-29409690-5q42z\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.313185 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgrxp\" (UniqueName: \"kubernetes.io/projected/b895e470-c6f1-4072-b66d-b6c3cb1791e8-kube-api-access-zgrxp\") pod \"collect-profiles-29409690-5q42z\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:00 crc kubenswrapper[4867]: I1201 09:30:00.492430 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.086786 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.114608 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-config-data\") pod \"318a4560-9e79-46fe-96bf-aaa534848b45\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.114652 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-log-httpd\") pod \"318a4560-9e79-46fe-96bf-aaa534848b45\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.114676 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-sg-core-conf-yaml\") pod \"318a4560-9e79-46fe-96bf-aaa534848b45\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.114764 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-run-httpd\") pod \"318a4560-9e79-46fe-96bf-aaa534848b45\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.114801 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-scripts\") pod \"318a4560-9e79-46fe-96bf-aaa534848b45\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.120762 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8bdt\" (UniqueName: 
\"kubernetes.io/projected/318a4560-9e79-46fe-96bf-aaa534848b45-kube-api-access-j8bdt\") pod \"318a4560-9e79-46fe-96bf-aaa534848b45\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.120894 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-combined-ca-bundle\") pod \"318a4560-9e79-46fe-96bf-aaa534848b45\" (UID: \"318a4560-9e79-46fe-96bf-aaa534848b45\") " Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.116241 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "318a4560-9e79-46fe-96bf-aaa534848b45" (UID: "318a4560-9e79-46fe-96bf-aaa534848b45"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.116470 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "318a4560-9e79-46fe-96bf-aaa534848b45" (UID: "318a4560-9e79-46fe-96bf-aaa534848b45"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.123271 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.123292 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/318a4560-9e79-46fe-96bf-aaa534848b45-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.197092 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-scripts" (OuterVolumeSpecName: "scripts") pod "318a4560-9e79-46fe-96bf-aaa534848b45" (UID: "318a4560-9e79-46fe-96bf-aaa534848b45"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.213952 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318a4560-9e79-46fe-96bf-aaa534848b45-kube-api-access-j8bdt" (OuterVolumeSpecName: "kube-api-access-j8bdt") pod "318a4560-9e79-46fe-96bf-aaa534848b45" (UID: "318a4560-9e79-46fe-96bf-aaa534848b45"). InnerVolumeSpecName "kube-api-access-j8bdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.220107 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "318a4560-9e79-46fe-96bf-aaa534848b45" (UID: "318a4560-9e79-46fe-96bf-aaa534848b45"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.229480 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.230020 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.230102 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8bdt\" (UniqueName: \"kubernetes.io/projected/318a4560-9e79-46fe-96bf-aaa534848b45-kube-api-access-j8bdt\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.232295 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8d9d-account-create-update-q8sjd"] Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.282667 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.478027 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qm4ft"] Dec 01 09:30:01 crc kubenswrapper[4867]: W1201 09:30:01.489449 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ea5fe34_452d_4805_8bda_47d8f1ab2381.slice/crio-b4e2c80664788073d3f15d79b46d62fcd4edcce3921474e349ab6497c4bbe861 WatchSource:0}: Error finding container b4e2c80664788073d3f15d79b46d62fcd4edcce3921474e349ab6497c4bbe861: Status 404 returned error can't find the container with id b4e2c80664788073d3f15d79b46d62fcd4edcce3921474e349ab6497c4bbe861 Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.500477 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-proxy-9dcc6b98f-chkvz" event={"ID":"476caa3a-28ba-471d-b4c0-c263c5960a87","Type":"ContainerStarted","Data":"afb724174be456e9b11cdba3c7be927e7955507257e477b94098fcb493bf78d9"} Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.515350 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"318a4560-9e79-46fe-96bf-aaa534848b45","Type":"ContainerDied","Data":"22ea4e93b15affb07f1afa78412ecddc47450ea74f5f616f67a4f1e5e0f8f5c7"} Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.515402 4867 scope.go:117] "RemoveContainer" containerID="d53a66d16cfe9b2aaf0a818646fdad9bf43c42fb7929ee3c443e40dde294ee35" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.515523 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.531726 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerStarted","Data":"c7ec8779b58f97fafd5134c7e65888047b97ae6e37afffbfa008a76f648c7186"} Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.537639 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" event={"ID":"a8f85ac5-7d71-4b5f-ae85-67b245b18c18","Type":"ContainerStarted","Data":"a8fecc296a053199874da7cbd9d076913ceffe864ed25a3a345700f6e55ffd47"} Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.623044 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "318a4560-9e79-46fe-96bf-aaa534848b45" (UID: "318a4560-9e79-46fe-96bf-aaa534848b45"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.693301 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.729990 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kxgsc"] Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.744380 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0fcd-account-create-update-pkxwm"] Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.751065 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.792949 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-config-data" (OuterVolumeSpecName: "config-data") pod "318a4560-9e79-46fe-96bf-aaa534848b45" (UID: "318a4560-9e79-46fe-96bf-aaa534848b45"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.795206 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318a4560-9e79-46fe-96bf-aaa534848b45-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.912676 4867 scope.go:117] "RemoveContainer" containerID="b2561a8a1932da1b4d94559b1824ad5e60ff376b1697831aa0d6039cb12737f0" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.971401 4867 scope.go:117] "RemoveContainer" containerID="e77cb88d4067f405f4fafdeb06849446216517878074e66a98bd78d9f3b471ba" Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.971462 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:01 crc kubenswrapper[4867]: I1201 09:30:01.998943 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.009543 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:02 crc kubenswrapper[4867]: E1201 09:30:02.010240 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="ceilometer-notification-agent" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.010272 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="ceilometer-notification-agent" Dec 01 09:30:02 crc kubenswrapper[4867]: E1201 09:30:02.010291 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="sg-core" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.010299 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="sg-core" Dec 01 09:30:02 crc kubenswrapper[4867]: E1201 09:30:02.010360 4867 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="proxy-httpd" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.010368 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="proxy-httpd" Dec 01 09:30:02 crc kubenswrapper[4867]: E1201 09:30:02.010389 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="ceilometer-central-agent" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.010397 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="ceilometer-central-agent" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.010603 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="sg-core" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.010626 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="ceilometer-central-agent" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.010640 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="ceilometer-notification-agent" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.010684 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" containerName="proxy-httpd" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.012794 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.018888 4867 scope.go:117] "RemoveContainer" containerID="955ae7214111cfad62bad5b8d44104f863a69864fcd2ddd389e921e3b8c7e03f" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.019112 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.019218 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.039729 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.108560 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.108634 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-run-httpd\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.108665 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-config-data\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.108706 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.108866 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r69lr\" (UniqueName: \"kubernetes.io/projected/2eafa378-9a4a-4028-a0cb-fbb55fa15972-kube-api-access-r69lr\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.108919 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-scripts\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.108948 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-log-httpd\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.210165 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-log-httpd\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.210624 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.210661 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-run-httpd\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.210683 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-config-data\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.210704 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.210874 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r69lr\" (UniqueName: \"kubernetes.io/projected/2eafa378-9a4a-4028-a0cb-fbb55fa15972-kube-api-access-r69lr\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.210908 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-scripts\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.212196 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-run-httpd\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.212362 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-log-httpd\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.217169 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-scripts\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.217897 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.218267 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.224001 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-config-data\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.269647 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r69lr\" (UniqueName: \"kubernetes.io/projected/2eafa378-9a4a-4028-a0cb-fbb55fa15972-kube-api-access-r69lr\") pod \"ceilometer-0\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.359311 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.595434 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d47c7cb76-srf4p" event={"ID":"8bd4fac2-df2c-4aab-bf00-99b54a83ddca","Type":"ContainerStarted","Data":"1ea30f76a6e3c0a162d6bb9415d525b9c47c036354111f0ab5aa32652d0af895"} Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.612448 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kxgsc" event={"ID":"37bc51bf-0822-420e-8d4a-5cb236dd83e4","Type":"ContainerStarted","Data":"d01426db197b797b1ac09d6d10e8ed0d56e3cae22cdf76a5a5b05377d005d206"} Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.623497 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qm4ft" event={"ID":"5ea5fe34-452d-4805-8bda-47d8f1ab2381","Type":"ContainerStarted","Data":"b4e2c80664788073d3f15d79b46d62fcd4edcce3921474e349ab6497c4bbe861"} Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.634569 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0fcd-account-create-update-pkxwm" event={"ID":"adf1dfe6-d703-43e9-9aca-436d6b37c2e9","Type":"ContainerStarted","Data":"ddea6a1a9fc43e276242b7c9139c776af91c1ba356f39ae1b4737174267102f5"} Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.838446 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="318a4560-9e79-46fe-96bf-aaa534848b45" path="/var/lib/kubelet/pods/318a4560-9e79-46fe-96bf-aaa534848b45/volumes" Dec 01 09:30:02 crc 
kubenswrapper[4867]: I1201 09:30:02.905843 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:30:02 crc kubenswrapper[4867]: I1201 09:30:02.905891 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.050162 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z"] Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.089771 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kwrll"] Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.116435 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.118058 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c564-account-create-update-h6vqd"] Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.601118 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:03 crc kubenswrapper[4867]: W1201 09:30:03.618586 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eafa378_9a4a_4028_a0cb_fbb55fa15972.slice/crio-3129a4d600de30abf4a3db2ab620900f2dff57b3661f9fd110a492533b87d3f9 WatchSource:0}: Error finding container 3129a4d600de30abf4a3db2ab620900f2dff57b3661f9fd110a492533b87d3f9: Status 404 returned error can't find the container with id 3129a4d600de30abf4a3db2ab620900f2dff57b3661f9fd110a492533b87d3f9 Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.651502 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerStarted","Data":"3129a4d600de30abf4a3db2ab620900f2dff57b3661f9fd110a492533b87d3f9"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.653479 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" event={"ID":"a8f85ac5-7d71-4b5f-ae85-67b245b18c18","Type":"ContainerStarted","Data":"b44a345ee62e34ad61416dcfff0a0fa9014134312601d92b46bc5a5faa525f2f"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.657210 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kxgsc" event={"ID":"37bc51bf-0822-420e-8d4a-5cb236dd83e4","Type":"ContainerStarted","Data":"e30b2dd307779de6d0c69d5daa297da446630ce11e847904b81d1e9857a6e808"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.659051 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qm4ft" event={"ID":"5ea5fe34-452d-4805-8bda-47d8f1ab2381","Type":"ContainerStarted","Data":"17574e05985b2b4766ede465929a62c0017d755e96dc17eb9460add4f9de65cf"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.662251 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0fcd-account-create-update-pkxwm" event={"ID":"adf1dfe6-d703-43e9-9aca-436d6b37c2e9","Type":"ContainerStarted","Data":"e34023d4caf0c4f8960e4d119265440dcb0a7191944d841d1b3c3cf1054cf5b6"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.668374 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" event={"ID":"b895e470-c6f1-4072-b66d-b6c3cb1791e8","Type":"ContainerStarted","Data":"a9439d6639712462f18577de1fc5e27800d26f9b7b03a357f2c8c6b1fd540f21"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.672859 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"e2fbdcd3-0c11-4681-99af-c9b4fb717637","Type":"ContainerStarted","Data":"a47f8efea1771ac3a1f29e0ff42b0366d79f221b6e7a6d066ec972bd43c08d9b"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.681336 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9dcc6b98f-chkvz" event={"ID":"476caa3a-28ba-471d-b4c0-c263c5960a87","Type":"ContainerStarted","Data":"36608b81063321ed903aaab757fd9194e1b979fc0ff2887def33279a9bde120b"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.681874 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.681918 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.689478 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c564-account-create-update-h6vqd" event={"ID":"3e5461b6-70c3-4b0a-aea5-827baa9fc665","Type":"ContainerStarted","Data":"f4054d837c16ee98f9d9b7456b1536790ce64c8dd3d31494697026330d0772ff"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.698828 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kwrll" event={"ID":"89cd7bb8-e306-4d6d-96a9-33e00ed9f194","Type":"ContainerStarted","Data":"8d969fbf16b5d8d1a017f4d9053fa2c4dce1886983a4f23abaee834930d3ec07"} Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.701967 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" podStartSLOduration=6.701942406 podStartE2EDuration="6.701942406s" podCreationTimestamp="2025-12-01 09:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:03.674083173 +0000 UTC m=+1325.133469927" watchObservedRunningTime="2025-12-01 
09:30:03.701942406 +0000 UTC m=+1325.161329160" Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.714626 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=5.354178223 podStartE2EDuration="21.714609763s" podCreationTimestamp="2025-12-01 09:29:42 +0000 UTC" firstStartedPulling="2025-12-01 09:29:44.31867127 +0000 UTC m=+1305.778058034" lastFinishedPulling="2025-12-01 09:30:00.67910282 +0000 UTC m=+1322.138489574" observedRunningTime="2025-12-01 09:30:03.702394379 +0000 UTC m=+1325.161781133" watchObservedRunningTime="2025-12-01 09:30:03.714609763 +0000 UTC m=+1325.173996517" Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.735132 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0fcd-account-create-update-pkxwm" podStartSLOduration=7.735109814 podStartE2EDuration="7.735109814s" podCreationTimestamp="2025-12-01 09:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:03.730003204 +0000 UTC m=+1325.189389958" watchObservedRunningTime="2025-12-01 09:30:03.735109814 +0000 UTC m=+1325.194496568" Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.764484 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-qm4ft" podStartSLOduration=7.764463468 podStartE2EDuration="7.764463468s" podCreationTimestamp="2025-12-01 09:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:03.761847806 +0000 UTC m=+1325.221234570" watchObservedRunningTime="2025-12-01 09:30:03.764463468 +0000 UTC m=+1325.223850222" Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.799618 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-kxgsc" 
podStartSLOduration=7.799577899 podStartE2EDuration="7.799577899s" podCreationTimestamp="2025-12-01 09:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:03.773991698 +0000 UTC m=+1325.233378452" watchObservedRunningTime="2025-12-01 09:30:03.799577899 +0000 UTC m=+1325.258964653" Dec 01 09:30:03 crc kubenswrapper[4867]: I1201 09:30:03.819699 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-9dcc6b98f-chkvz" podStartSLOduration=13.819669869 podStartE2EDuration="13.819669869s" podCreationTimestamp="2025-12-01 09:29:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:03.803941079 +0000 UTC m=+1325.263327843" watchObservedRunningTime="2025-12-01 09:30:03.819669869 +0000 UTC m=+1325.279056653" Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.707907 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" event={"ID":"b895e470-c6f1-4072-b66d-b6c3cb1791e8","Type":"ContainerStarted","Data":"67f71ca5c1f144414e072478412624ac6438ededc97a3353c0696e206d24dfe1"} Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.709683 4867 generic.go:334] "Generic (PLEG): container finished" podID="37bc51bf-0822-420e-8d4a-5cb236dd83e4" containerID="e30b2dd307779de6d0c69d5daa297da446630ce11e847904b81d1e9857a6e808" exitCode=0 Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.709823 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kxgsc" event={"ID":"37bc51bf-0822-420e-8d4a-5cb236dd83e4","Type":"ContainerDied","Data":"e30b2dd307779de6d0c69d5daa297da446630ce11e847904b81d1e9857a6e808"} Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.711953 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-c564-account-create-update-h6vqd" event={"ID":"3e5461b6-70c3-4b0a-aea5-827baa9fc665","Type":"ContainerStarted","Data":"8380c6624ae4ca667b01d7fcfec3f1549121f09e65da55860522386cd182e428"} Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.713462 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kwrll" event={"ID":"89cd7bb8-e306-4d6d-96a9-33e00ed9f194","Type":"ContainerStarted","Data":"c4b319bdd9890133e51e8a855985e834ffdc22314a0bfd4357c58ac42b61dd7f"} Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.715006 4867 generic.go:334] "Generic (PLEG): container finished" podID="5ea5fe34-452d-4805-8bda-47d8f1ab2381" containerID="17574e05985b2b4766ede465929a62c0017d755e96dc17eb9460add4f9de65cf" exitCode=0 Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.715056 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qm4ft" event={"ID":"5ea5fe34-452d-4805-8bda-47d8f1ab2381","Type":"ContainerDied","Data":"17574e05985b2b4766ede465929a62c0017d755e96dc17eb9460add4f9de65cf"} Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.717211 4867 generic.go:334] "Generic (PLEG): container finished" podID="adf1dfe6-d703-43e9-9aca-436d6b37c2e9" containerID="e34023d4caf0c4f8960e4d119265440dcb0a7191944d841d1b3c3cf1054cf5b6" exitCode=0 Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.717408 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0fcd-account-create-update-pkxwm" event={"ID":"adf1dfe6-d703-43e9-9aca-436d6b37c2e9","Type":"ContainerDied","Data":"e34023d4caf0c4f8960e4d119265440dcb0a7191944d841d1b3c3cf1054cf5b6"} Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.733005 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" podStartSLOduration=4.732984389 podStartE2EDuration="4.732984389s" podCreationTimestamp="2025-12-01 09:30:00 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:04.726139082 +0000 UTC m=+1326.185525836" watchObservedRunningTime="2025-12-01 09:30:04.732984389 +0000 UTC m=+1326.192371143" Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.774624 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-kwrll" podStartSLOduration=8.774599248 podStartE2EDuration="8.774599248s" podCreationTimestamp="2025-12-01 09:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:04.765213642 +0000 UTC m=+1326.224600406" watchObservedRunningTime="2025-12-01 09:30:04.774599248 +0000 UTC m=+1326.233986002" Dec 01 09:30:04 crc kubenswrapper[4867]: I1201 09:30:04.820961 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-c564-account-create-update-h6vqd" podStartSLOduration=7.820943088 podStartE2EDuration="7.820943088s" podCreationTimestamp="2025-12-01 09:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:04.812457415 +0000 UTC m=+1326.271844179" watchObservedRunningTime="2025-12-01 09:30:04.820943088 +0000 UTC m=+1326.280329842" Dec 01 09:30:05 crc kubenswrapper[4867]: I1201 09:30:05.726862 4867 generic.go:334] "Generic (PLEG): container finished" podID="b895e470-c6f1-4072-b66d-b6c3cb1791e8" containerID="67f71ca5c1f144414e072478412624ac6438ededc97a3353c0696e206d24dfe1" exitCode=0 Dec 01 09:30:05 crc kubenswrapper[4867]: I1201 09:30:05.726922 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" 
event={"ID":"b895e470-c6f1-4072-b66d-b6c3cb1791e8","Type":"ContainerDied","Data":"67f71ca5c1f144414e072478412624ac6438ededc97a3353c0696e206d24dfe1"} Dec 01 09:30:05 crc kubenswrapper[4867]: I1201 09:30:05.728421 4867 generic.go:334] "Generic (PLEG): container finished" podID="a8f85ac5-7d71-4b5f-ae85-67b245b18c18" containerID="b44a345ee62e34ad61416dcfff0a0fa9014134312601d92b46bc5a5faa525f2f" exitCode=0 Dec 01 09:30:05 crc kubenswrapper[4867]: I1201 09:30:05.728545 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" event={"ID":"a8f85ac5-7d71-4b5f-ae85-67b245b18c18","Type":"ContainerDied","Data":"b44a345ee62e34ad61416dcfff0a0fa9014134312601d92b46bc5a5faa525f2f"} Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.113642 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.306406 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ea5fe34-452d-4805-8bda-47d8f1ab2381-operator-scripts\") pod \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\" (UID: \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\") " Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.306922 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ml4g\" (UniqueName: \"kubernetes.io/projected/5ea5fe34-452d-4805-8bda-47d8f1ab2381-kube-api-access-5ml4g\") pod \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\" (UID: \"5ea5fe34-452d-4805-8bda-47d8f1ab2381\") " Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.307246 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ea5fe34-452d-4805-8bda-47d8f1ab2381-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ea5fe34-452d-4805-8bda-47d8f1ab2381" (UID: "5ea5fe34-452d-4805-8bda-47d8f1ab2381"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.307717 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ea5fe34-452d-4805-8bda-47d8f1ab2381-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.312493 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ea5fe34-452d-4805-8bda-47d8f1ab2381-kube-api-access-5ml4g" (OuterVolumeSpecName: "kube-api-access-5ml4g") pod "5ea5fe34-452d-4805-8bda-47d8f1ab2381" (UID: "5ea5fe34-452d-4805-8bda-47d8f1ab2381"). InnerVolumeSpecName "kube-api-access-5ml4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.410206 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ml4g\" (UniqueName: \"kubernetes.io/projected/5ea5fe34-452d-4805-8bda-47d8f1ab2381-kube-api-access-5ml4g\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.471794 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.516018 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-operator-scripts\") pod \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\" (UID: \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\") " Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.516091 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n5hl\" (UniqueName: \"kubernetes.io/projected/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-kube-api-access-7n5hl\") pod \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\" (UID: \"adf1dfe6-d703-43e9-9aca-436d6b37c2e9\") " Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.525088 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adf1dfe6-d703-43e9-9aca-436d6b37c2e9" (UID: "adf1dfe6-d703-43e9-9aca-436d6b37c2e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.527492 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-kube-api-access-7n5hl" (OuterVolumeSpecName: "kube-api-access-7n5hl") pod "adf1dfe6-d703-43e9-9aca-436d6b37c2e9" (UID: "adf1dfe6-d703-43e9-9aca-436d6b37c2e9"). InnerVolumeSpecName "kube-api-access-7n5hl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.619118 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.619159 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n5hl\" (UniqueName: \"kubernetes.io/projected/adf1dfe6-d703-43e9-9aca-436d6b37c2e9-kube-api-access-7n5hl\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.684980 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.765622 4867 generic.go:334] "Generic (PLEG): container finished" podID="3e5461b6-70c3-4b0a-aea5-827baa9fc665" containerID="8380c6624ae4ca667b01d7fcfec3f1549121f09e65da55860522386cd182e428" exitCode=0 Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.765693 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c564-account-create-update-h6vqd" event={"ID":"3e5461b6-70c3-4b0a-aea5-827baa9fc665","Type":"ContainerDied","Data":"8380c6624ae4ca667b01d7fcfec3f1549121f09e65da55860522386cd182e428"} Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.776060 4867 generic.go:334] "Generic (PLEG): container finished" podID="89cd7bb8-e306-4d6d-96a9-33e00ed9f194" containerID="c4b319bdd9890133e51e8a855985e834ffdc22314a0bfd4357c58ac42b61dd7f" exitCode=0 Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.776149 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kwrll" event={"ID":"89cd7bb8-e306-4d6d-96a9-33e00ed9f194","Type":"ContainerDied","Data":"c4b319bdd9890133e51e8a855985e834ffdc22314a0bfd4357c58ac42b61dd7f"} Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 
09:30:06.823859 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq8mw\" (UniqueName: \"kubernetes.io/projected/37bc51bf-0822-420e-8d4a-5cb236dd83e4-kube-api-access-lq8mw\") pod \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\" (UID: \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\") " Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.823941 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37bc51bf-0822-420e-8d4a-5cb236dd83e4-operator-scripts\") pod \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\" (UID: \"37bc51bf-0822-420e-8d4a-5cb236dd83e4\") " Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.828406 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37bc51bf-0822-420e-8d4a-5cb236dd83e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37bc51bf-0822-420e-8d4a-5cb236dd83e4" (UID: "37bc51bf-0822-420e-8d4a-5cb236dd83e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.837188 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qm4ft" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.855017 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37bc51bf-0822-420e-8d4a-5cb236dd83e4-kube-api-access-lq8mw" (OuterVolumeSpecName: "kube-api-access-lq8mw") pod "37bc51bf-0822-420e-8d4a-5cb236dd83e4" (UID: "37bc51bf-0822-420e-8d4a-5cb236dd83e4"). InnerVolumeSpecName "kube-api-access-lq8mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.902328 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0fcd-account-create-update-pkxwm" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.919292 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kxgsc" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.924162 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qm4ft" event={"ID":"5ea5fe34-452d-4805-8bda-47d8f1ab2381","Type":"ContainerDied","Data":"b4e2c80664788073d3f15d79b46d62fcd4edcce3921474e349ab6497c4bbe861"} Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.924204 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4e2c80664788073d3f15d79b46d62fcd4edcce3921474e349ab6497c4bbe861" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.924221 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0fcd-account-create-update-pkxwm" event={"ID":"adf1dfe6-d703-43e9-9aca-436d6b37c2e9","Type":"ContainerDied","Data":"ddea6a1a9fc43e276242b7c9139c776af91c1ba356f39ae1b4737174267102f5"} Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.924231 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddea6a1a9fc43e276242b7c9139c776af91c1ba356f39ae1b4737174267102f5" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.924239 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kxgsc" event={"ID":"37bc51bf-0822-420e-8d4a-5cb236dd83e4","Type":"ContainerDied","Data":"d01426db197b797b1ac09d6d10e8ed0d56e3cae22cdf76a5a5b05377d005d206"} Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.924249 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d01426db197b797b1ac09d6d10e8ed0d56e3cae22cdf76a5a5b05377d005d206" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.933909 4867 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lq8mw\" (UniqueName: \"kubernetes.io/projected/37bc51bf-0822-420e-8d4a-5cb236dd83e4-kube-api-access-lq8mw\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:06 crc kubenswrapper[4867]: I1201 09:30:06.933969 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37bc51bf-0822-420e-8d4a-5cb236dd83e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.412826 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.545514 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b895e470-c6f1-4072-b66d-b6c3cb1791e8-secret-volume\") pod \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.546008 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b895e470-c6f1-4072-b66d-b6c3cb1791e8-config-volume\") pod \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.546099 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgrxp\" (UniqueName: \"kubernetes.io/projected/b895e470-c6f1-4072-b66d-b6c3cb1791e8-kube-api-access-zgrxp\") pod \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\" (UID: \"b895e470-c6f1-4072-b66d-b6c3cb1791e8\") " Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.547229 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b895e470-c6f1-4072-b66d-b6c3cb1791e8-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"b895e470-c6f1-4072-b66d-b6c3cb1791e8" (UID: "b895e470-c6f1-4072-b66d-b6c3cb1791e8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.552693 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b895e470-c6f1-4072-b66d-b6c3cb1791e8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b895e470-c6f1-4072-b66d-b6c3cb1791e8" (UID: "b895e470-c6f1-4072-b66d-b6c3cb1791e8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.553690 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b895e470-c6f1-4072-b66d-b6c3cb1791e8-kube-api-access-zgrxp" (OuterVolumeSpecName: "kube-api-access-zgrxp") pod "b895e470-c6f1-4072-b66d-b6c3cb1791e8" (UID: "b895e470-c6f1-4072-b66d-b6c3cb1791e8"). InnerVolumeSpecName "kube-api-access-zgrxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.647974 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b895e470-c6f1-4072-b66d-b6c3cb1791e8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.648004 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b895e470-c6f1-4072-b66d-b6c3cb1791e8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.648015 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgrxp\" (UniqueName: \"kubernetes.io/projected/b895e470-c6f1-4072-b66d-b6c3cb1791e8-kube-api-access-zgrxp\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.681219 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.852197 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr9fl\" (UniqueName: \"kubernetes.io/projected/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-kube-api-access-gr9fl\") pod \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\" (UID: \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\") " Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.852286 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-operator-scripts\") pod \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\" (UID: \"a8f85ac5-7d71-4b5f-ae85-67b245b18c18\") " Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.852875 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8f85ac5-7d71-4b5f-ae85-67b245b18c18" (UID: "a8f85ac5-7d71-4b5f-ae85-67b245b18c18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.857407 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-kube-api-access-gr9fl" (OuterVolumeSpecName: "kube-api-access-gr9fl") pod "a8f85ac5-7d71-4b5f-ae85-67b245b18c18" (UID: "a8f85ac5-7d71-4b5f-ae85-67b245b18c18"). InnerVolumeSpecName "kube-api-access-gr9fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.932079 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.932078 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z" event={"ID":"b895e470-c6f1-4072-b66d-b6c3cb1791e8","Type":"ContainerDied","Data":"a9439d6639712462f18577de1fc5e27800d26f9b7b03a357f2c8c6b1fd540f21"} Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.932245 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9439d6639712462f18577de1fc5e27800d26f9b7b03a357f2c8c6b1fd540f21" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.933682 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerStarted","Data":"34a0c4a81de1bb4129e5d06558eddcb546c909993d7ec0987b5f53321ba11299"} Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.935156 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.940649 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d9d-account-create-update-q8sjd" event={"ID":"a8f85ac5-7d71-4b5f-ae85-67b245b18c18","Type":"ContainerDied","Data":"a8fecc296a053199874da7cbd9d076913ceffe864ed25a3a345700f6e55ffd47"} Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.940694 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8fecc296a053199874da7cbd9d076913ceffe864ed25a3a345700f6e55ffd47" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.955355 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr9fl\" (UniqueName: \"kubernetes.io/projected/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-kube-api-access-gr9fl\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:07 crc kubenswrapper[4867]: I1201 09:30:07.955386 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8f85ac5-7d71-4b5f-ae85-67b245b18c18-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.509370 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.568573 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w7rr\" (UniqueName: \"kubernetes.io/projected/3e5461b6-70c3-4b0a-aea5-827baa9fc665-kube-api-access-7w7rr\") pod \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\" (UID: \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\") " Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.568614 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e5461b6-70c3-4b0a-aea5-827baa9fc665-operator-scripts\") pod \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\" (UID: \"3e5461b6-70c3-4b0a-aea5-827baa9fc665\") " Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.569897 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e5461b6-70c3-4b0a-aea5-827baa9fc665-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e5461b6-70c3-4b0a-aea5-827baa9fc665" (UID: "3e5461b6-70c3-4b0a-aea5-827baa9fc665"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.591360 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e5461b6-70c3-4b0a-aea5-827baa9fc665-kube-api-access-7w7rr" (OuterVolumeSpecName: "kube-api-access-7w7rr") pod "3e5461b6-70c3-4b0a-aea5-827baa9fc665" (UID: "3e5461b6-70c3-4b0a-aea5-827baa9fc665"). InnerVolumeSpecName "kube-api-access-7w7rr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.670830 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w7rr\" (UniqueName: \"kubernetes.io/projected/3e5461b6-70c3-4b0a-aea5-827baa9fc665-kube-api-access-7w7rr\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.671180 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e5461b6-70c3-4b0a-aea5-827baa9fc665-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.718928 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kwrll" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.775573 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-operator-scripts\") pod \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\" (UID: \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\") " Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.775735 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npr6m\" (UniqueName: \"kubernetes.io/projected/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-kube-api-access-npr6m\") pod \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\" (UID: \"89cd7bb8-e306-4d6d-96a9-33e00ed9f194\") " Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.780522 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89cd7bb8-e306-4d6d-96a9-33e00ed9f194" (UID: "89cd7bb8-e306-4d6d-96a9-33e00ed9f194"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.794269 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-kube-api-access-npr6m" (OuterVolumeSpecName: "kube-api-access-npr6m") pod "89cd7bb8-e306-4d6d-96a9-33e00ed9f194" (UID: "89cd7bb8-e306-4d6d-96a9-33e00ed9f194"). InnerVolumeSpecName "kube-api-access-npr6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.878456 4867 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.878503 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npr6m\" (UniqueName: \"kubernetes.io/projected/89cd7bb8-e306-4d6d-96a9-33e00ed9f194-kube-api-access-npr6m\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.949997 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kwrll" event={"ID":"89cd7bb8-e306-4d6d-96a9-33e00ed9f194","Type":"ContainerDied","Data":"8d969fbf16b5d8d1a017f4d9053fa2c4dce1886983a4f23abaee834930d3ec07"} Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.950035 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d969fbf16b5d8d1a017f4d9053fa2c4dce1886983a4f23abaee834930d3ec07" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.950097 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kwrll" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.957235 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerStarted","Data":"8163eb01f825678238e1d90d70285bd60f1cbc2337d58fb0372fc95e5b2fd651"} Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.957286 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerStarted","Data":"0c414ebcbdc697cbaed9de55d6ccc1a0159a0be1d4d49a29a38ea5dd5b2ff49a"} Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.960196 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c564-account-create-update-h6vqd" event={"ID":"3e5461b6-70c3-4b0a-aea5-827baa9fc665","Type":"ContainerDied","Data":"f4054d837c16ee98f9d9b7456b1536790ce64c8dd3d31494697026330d0772ff"} Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.960254 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4054d837c16ee98f9d9b7456b1536790ce64c8dd3d31494697026330d0772ff" Dec 01 09:30:08 crc kubenswrapper[4867]: I1201 09:30:08.960327 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c564-account-create-update-h6vqd" Dec 01 09:30:10 crc kubenswrapper[4867]: I1201 09:30:10.991688 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:30:10 crc kubenswrapper[4867]: I1201 09:30:10.997392 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9dcc6b98f-chkvz" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.434106 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdt5g"] Dec 01 09:30:12 crc kubenswrapper[4867]: E1201 09:30:12.434755 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea5fe34-452d-4805-8bda-47d8f1ab2381" containerName="mariadb-database-create" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.434768 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea5fe34-452d-4805-8bda-47d8f1ab2381" containerName="mariadb-database-create" Dec 01 09:30:12 crc kubenswrapper[4867]: E1201 09:30:12.434778 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b895e470-c6f1-4072-b66d-b6c3cb1791e8" containerName="collect-profiles" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.434784 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b895e470-c6f1-4072-b66d-b6c3cb1791e8" containerName="collect-profiles" Dec 01 09:30:12 crc kubenswrapper[4867]: E1201 09:30:12.434797 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cd7bb8-e306-4d6d-96a9-33e00ed9f194" containerName="mariadb-database-create" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.434804 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cd7bb8-e306-4d6d-96a9-33e00ed9f194" containerName="mariadb-database-create" Dec 01 09:30:12 crc kubenswrapper[4867]: E1201 09:30:12.434843 4867 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adf1dfe6-d703-43e9-9aca-436d6b37c2e9" containerName="mariadb-account-create-update" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.434850 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf1dfe6-d703-43e9-9aca-436d6b37c2e9" containerName="mariadb-account-create-update" Dec 01 09:30:12 crc kubenswrapper[4867]: E1201 09:30:12.434861 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5461b6-70c3-4b0a-aea5-827baa9fc665" containerName="mariadb-account-create-update" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.434867 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5461b6-70c3-4b0a-aea5-827baa9fc665" containerName="mariadb-account-create-update" Dec 01 09:30:12 crc kubenswrapper[4867]: E1201 09:30:12.434875 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37bc51bf-0822-420e-8d4a-5cb236dd83e4" containerName="mariadb-database-create" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.434881 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="37bc51bf-0822-420e-8d4a-5cb236dd83e4" containerName="mariadb-database-create" Dec 01 09:30:12 crc kubenswrapper[4867]: E1201 09:30:12.434890 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f85ac5-7d71-4b5f-ae85-67b245b18c18" containerName="mariadb-account-create-update" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.434895 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f85ac5-7d71-4b5f-ae85-67b245b18c18" containerName="mariadb-account-create-update" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.435054 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f85ac5-7d71-4b5f-ae85-67b245b18c18" containerName="mariadb-account-create-update" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.435064 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="89cd7bb8-e306-4d6d-96a9-33e00ed9f194" containerName="mariadb-database-create" Dec 01 
09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.435088 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5461b6-70c3-4b0a-aea5-827baa9fc665" containerName="mariadb-account-create-update" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.435100 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b895e470-c6f1-4072-b66d-b6c3cb1791e8" containerName="collect-profiles" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.435109 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ea5fe34-452d-4805-8bda-47d8f1ab2381" containerName="mariadb-database-create" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.435119 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="37bc51bf-0822-420e-8d4a-5cb236dd83e4" containerName="mariadb-database-create" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.435130 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf1dfe6-d703-43e9-9aca-436d6b37c2e9" containerName="mariadb-account-create-update" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.435687 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.437574 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cbblx" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.441360 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.454641 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdt5g"] Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.475045 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.565091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fml89\" (UniqueName: \"kubernetes.io/projected/e83471d7-4d9d-427c-b769-bd072acbaae0-kube-api-access-fml89\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.565153 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-config-data\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.565367 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-scripts\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " 
pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.565418 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.667235 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-scripts\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.667287 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.667382 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fml89\" (UniqueName: \"kubernetes.io/projected/e83471d7-4d9d-427c-b769-bd072acbaae0-kube-api-access-fml89\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.667401 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-config-data\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: 
\"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.676012 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-config-data\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.676013 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.676612 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-scripts\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.691246 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fml89\" (UniqueName: \"kubernetes.io/projected/e83471d7-4d9d-427c-b769-bd072acbaae0-kube-api-access-fml89\") pod \"nova-cell0-conductor-db-sync-cdt5g\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.758490 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:30:12 crc kubenswrapper[4867]: I1201 09:30:12.907772 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 01 09:30:13 crc kubenswrapper[4867]: I1201 09:30:13.293372 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:30:13 crc kubenswrapper[4867]: I1201 09:30:13.294355 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:30:13 crc kubenswrapper[4867]: I1201 09:30:13.294796 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 01 09:30:13 crc kubenswrapper[4867]: I1201 09:30:13.347973 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdt5g"] Dec 01 09:30:13 crc kubenswrapper[4867]: W1201 09:30:13.350562 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode83471d7_4d9d_427c_b769_bd072acbaae0.slice/crio-f84aeb93a79a6a9067c2ec22c426a95f0a177b3b16005ffb912726368c59e765 WatchSource:0}: Error finding container f84aeb93a79a6a9067c2ec22c426a95f0a177b3b16005ffb912726368c59e765: Status 404 returned error can't find the container with id f84aeb93a79a6a9067c2ec22c426a95f0a177b3b16005ffb912726368c59e765 Dec 01 09:30:14 crc kubenswrapper[4867]: I1201 09:30:14.022977 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" event={"ID":"e83471d7-4d9d-427c-b769-bd072acbaae0","Type":"ContainerStarted","Data":"f84aeb93a79a6a9067c2ec22c426a95f0a177b3b16005ffb912726368c59e765"} Dec 01 09:30:15 crc kubenswrapper[4867]: I1201 09:30:15.045251 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerStarted","Data":"2bcb281e6c5cfdef3c3ac69f23f193b3da184dadb14a85047bd81fa3047fae34"} Dec 01 09:30:15 crc kubenswrapper[4867]: I1201 09:30:15.047348 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:30:15 crc kubenswrapper[4867]: I1201 09:30:15.070858 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.469294752 podStartE2EDuration="14.070843232s" podCreationTimestamp="2025-12-01 09:30:01 +0000 UTC" firstStartedPulling="2025-12-01 09:30:03.621500655 +0000 UTC m=+1325.080887409" lastFinishedPulling="2025-12-01 09:30:14.223049135 +0000 UTC m=+1335.682435889" observedRunningTime="2025-12-01 09:30:15.065131516 +0000 UTC m=+1336.524518270" watchObservedRunningTime="2025-12-01 09:30:15.070843232 +0000 UTC m=+1336.530229986" Dec 01 09:30:15 crc kubenswrapper[4867]: I1201 09:30:15.933183 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:17 crc kubenswrapper[4867]: I1201 09:30:17.069243 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="ceilometer-central-agent" containerID="cri-o://34a0c4a81de1bb4129e5d06558eddcb546c909993d7ec0987b5f53321ba11299" gracePeriod=30 Dec 01 09:30:17 crc kubenswrapper[4867]: I1201 09:30:17.070654 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="proxy-httpd" containerID="cri-o://2bcb281e6c5cfdef3c3ac69f23f193b3da184dadb14a85047bd81fa3047fae34" gracePeriod=30 Dec 01 09:30:17 crc kubenswrapper[4867]: I1201 09:30:17.070779 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="sg-core" containerID="cri-o://8163eb01f825678238e1d90d70285bd60f1cbc2337d58fb0372fc95e5b2fd651" gracePeriod=30 Dec 01 09:30:17 crc kubenswrapper[4867]: I1201 09:30:17.070902 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="ceilometer-notification-agent" containerID="cri-o://0c414ebcbdc697cbaed9de55d6ccc1a0159a0be1d4d49a29a38ea5dd5b2ff49a" gracePeriod=30 Dec 01 09:30:18 crc kubenswrapper[4867]: I1201 09:30:18.084786 4867 generic.go:334] "Generic (PLEG): container finished" podID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerID="2bcb281e6c5cfdef3c3ac69f23f193b3da184dadb14a85047bd81fa3047fae34" exitCode=0 Dec 01 09:30:18 crc kubenswrapper[4867]: I1201 09:30:18.085077 4867 generic.go:334] "Generic (PLEG): container finished" podID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerID="8163eb01f825678238e1d90d70285bd60f1cbc2337d58fb0372fc95e5b2fd651" exitCode=2 Dec 01 09:30:18 crc kubenswrapper[4867]: I1201 09:30:18.084846 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerDied","Data":"2bcb281e6c5cfdef3c3ac69f23f193b3da184dadb14a85047bd81fa3047fae34"} Dec 01 09:30:18 crc kubenswrapper[4867]: I1201 09:30:18.085120 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerDied","Data":"8163eb01f825678238e1d90d70285bd60f1cbc2337d58fb0372fc95e5b2fd651"} Dec 01 09:30:18 crc 
kubenswrapper[4867]: I1201 09:30:18.085134 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerDied","Data":"0c414ebcbdc697cbaed9de55d6ccc1a0159a0be1d4d49a29a38ea5dd5b2ff49a"} Dec 01 09:30:18 crc kubenswrapper[4867]: I1201 09:30:18.085088 4867 generic.go:334] "Generic (PLEG): container finished" podID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerID="0c414ebcbdc697cbaed9de55d6ccc1a0159a0be1d4d49a29a38ea5dd5b2ff49a" exitCode=0 Dec 01 09:30:21 crc kubenswrapper[4867]: I1201 09:30:21.601562 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:30:21 crc kubenswrapper[4867]: I1201 09:30:21.602295 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:30:21 crc kubenswrapper[4867]: I1201 09:30:21.602354 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:30:21 crc kubenswrapper[4867]: I1201 09:30:21.603062 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c10630f55a5e3f1966308004b9596564bba3f48b49f2091a432ccd55427b09a"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:30:21 crc kubenswrapper[4867]: I1201 09:30:21.603106 4867 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://7c10630f55a5e3f1966308004b9596564bba3f48b49f2091a432ccd55427b09a" gracePeriod=600 Dec 01 09:30:22 crc kubenswrapper[4867]: I1201 09:30:22.130965 4867 generic.go:334] "Generic (PLEG): container finished" podID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerID="34a0c4a81de1bb4129e5d06558eddcb546c909993d7ec0987b5f53321ba11299" exitCode=0 Dec 01 09:30:22 crc kubenswrapper[4867]: I1201 09:30:22.131041 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerDied","Data":"34a0c4a81de1bb4129e5d06558eddcb546c909993d7ec0987b5f53321ba11299"} Dec 01 09:30:22 crc kubenswrapper[4867]: I1201 09:30:22.134447 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="7c10630f55a5e3f1966308004b9596564bba3f48b49f2091a432ccd55427b09a" exitCode=0 Dec 01 09:30:22 crc kubenswrapper[4867]: I1201 09:30:22.134488 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"7c10630f55a5e3f1966308004b9596564bba3f48b49f2091a432ccd55427b09a"} Dec 01 09:30:22 crc kubenswrapper[4867]: I1201 09:30:22.134521 4867 scope.go:117] "RemoveContainer" containerID="3efb00c27c0eaaf97b5cf3c44be1e5a5598923c3a199804003a6c5848c9f9cea" Dec 01 09:30:22 crc kubenswrapper[4867]: I1201 09:30:22.907844 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 
10.217.0.145:8443: connect: connection refused" Dec 01 09:30:23 crc kubenswrapper[4867]: I1201 09:30:23.295347 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.234795 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2eafa378-9a4a-4028-a0cb-fbb55fa15972","Type":"ContainerDied","Data":"3129a4d600de30abf4a3db2ab620900f2dff57b3661f9fd110a492533b87d3f9"} Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.235387 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3129a4d600de30abf4a3db2ab620900f2dff57b3661f9fd110a492533b87d3f9" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.347553 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.385538 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-combined-ca-bundle\") pod \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.385580 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-scripts\") pod \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.385613 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-sg-core-conf-yaml\") pod \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.385649 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-run-httpd\") pod \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.385677 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-log-httpd\") pod \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.385701 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r69lr\" (UniqueName: 
\"kubernetes.io/projected/2eafa378-9a4a-4028-a0cb-fbb55fa15972-kube-api-access-r69lr\") pod \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.385778 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-config-data\") pod \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\" (UID: \"2eafa378-9a4a-4028-a0cb-fbb55fa15972\") " Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.395232 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2eafa378-9a4a-4028-a0cb-fbb55fa15972" (UID: "2eafa378-9a4a-4028-a0cb-fbb55fa15972"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.396479 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eafa378-9a4a-4028-a0cb-fbb55fa15972-kube-api-access-r69lr" (OuterVolumeSpecName: "kube-api-access-r69lr") pod "2eafa378-9a4a-4028-a0cb-fbb55fa15972" (UID: "2eafa378-9a4a-4028-a0cb-fbb55fa15972"). InnerVolumeSpecName "kube-api-access-r69lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.396793 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2eafa378-9a4a-4028-a0cb-fbb55fa15972" (UID: "2eafa378-9a4a-4028-a0cb-fbb55fa15972"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.414964 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-scripts" (OuterVolumeSpecName: "scripts") pod "2eafa378-9a4a-4028-a0cb-fbb55fa15972" (UID: "2eafa378-9a4a-4028-a0cb-fbb55fa15972"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.489661 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.489700 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.489714 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2eafa378-9a4a-4028-a0cb-fbb55fa15972-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.489726 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r69lr\" (UniqueName: \"kubernetes.io/projected/2eafa378-9a4a-4028-a0cb-fbb55fa15972-kube-api-access-r69lr\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.496292 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2eafa378-9a4a-4028-a0cb-fbb55fa15972" (UID: "2eafa378-9a4a-4028-a0cb-fbb55fa15972"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.570026 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2eafa378-9a4a-4028-a0cb-fbb55fa15972" (UID: "2eafa378-9a4a-4028-a0cb-fbb55fa15972"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.591200 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.591247 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.645230 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-config-data" (OuterVolumeSpecName: "config-data") pod "2eafa378-9a4a-4028-a0cb-fbb55fa15972" (UID: "2eafa378-9a4a-4028-a0cb-fbb55fa15972"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.692500 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eafa378-9a4a-4028-a0cb-fbb55fa15972-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:32 crc kubenswrapper[4867]: E1201 09:30:32.847194 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified" Dec 01 09:30:32 crc kubenswrapper[4867]: E1201 09:30:32.847414 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fml89,ReadOnly:true,MountPath:/var/ru
n/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-cdt5g_openstack(e83471d7-4d9d-427c-b769-bd072acbaae0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:32 crc kubenswrapper[4867]: E1201 09:30:32.848593 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" podUID="e83471d7-4d9d-427c-b769-bd072acbaae0" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.906345 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.906489 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.907754 4867 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"c7ec8779b58f97fafd5134c7e65888047b97ae6e37afffbfa008a76f648c7186"} pod="openstack/horizon-c846795f4-k7mlj" containerMessage="Container horizon failed startup probe, will be restarted" Dec 01 09:30:32 crc kubenswrapper[4867]: I1201 09:30:32.907854 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" containerID="cri-o://c7ec8779b58f97fafd5134c7e65888047b97ae6e37afffbfa008a76f648c7186" gracePeriod=30 Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.269351 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949"} Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.269688 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: E1201 09:30:33.272404 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" podUID="e83471d7-4d9d-427c-b769-bd072acbaae0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.293613 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.293706 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.294635 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"1ea30f76a6e3c0a162d6bb9415d525b9c47c036354111f0ab5aa32652d0af895"} pod="openstack/horizon-d47c7cb76-srf4p" containerMessage="Container horizon failed startup probe, will be restarted" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.294680 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" containerID="cri-o://1ea30f76a6e3c0a162d6bb9415d525b9c47c036354111f0ab5aa32652d0af895" gracePeriod=30 Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.348847 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.361534 4867 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.390470 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:33 crc kubenswrapper[4867]: E1201 09:30:33.393523 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="ceilometer-central-agent" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.393555 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="ceilometer-central-agent" Dec 01 09:30:33 crc kubenswrapper[4867]: E1201 09:30:33.393575 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="proxy-httpd" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.393583 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="proxy-httpd" Dec 01 09:30:33 crc kubenswrapper[4867]: E1201 09:30:33.393594 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="sg-core" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.393600 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="sg-core" Dec 01 09:30:33 crc kubenswrapper[4867]: E1201 09:30:33.393619 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="ceilometer-notification-agent" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.393627 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="ceilometer-notification-agent" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.393951 4867 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="ceilometer-notification-agent" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.393972 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="ceilometer-central-agent" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.393994 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="sg-core" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.394006 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" containerName="proxy-httpd" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.395945 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.400617 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.400981 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.415640 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.415688 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-config-data\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 
09:30:33.415765 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2d4t\" (UniqueName: \"kubernetes.io/projected/8d58b09d-4905-4deb-9aa4-8537d4ed954f-kube-api-access-q2d4t\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.415790 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-log-httpd\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.415835 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.415922 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-scripts\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.415944 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.415956 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-run-httpd\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc 
kubenswrapper[4867]: I1201 09:30:33.517059 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.517120 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-config-data\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.517201 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2d4t\" (UniqueName: \"kubernetes.io/projected/8d58b09d-4905-4deb-9aa4-8537d4ed954f-kube-api-access-q2d4t\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.517227 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-log-httpd\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.517254 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.517288 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-scripts\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.517306 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-run-httpd\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.517733 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-log-httpd\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.517777 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-run-httpd\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.522567 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.524022 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-config-data\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.527088 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-scripts\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.531989 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.543593 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2d4t\" (UniqueName: \"kubernetes.io/projected/8d58b09d-4905-4deb-9aa4-8537d4ed954f-kube-api-access-q2d4t\") pod \"ceilometer-0\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " pod="openstack/ceilometer-0" Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.640534 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:33 crc kubenswrapper[4867]: I1201 09:30:33.641364 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:34 crc kubenswrapper[4867]: I1201 09:30:34.010195 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:34 crc kubenswrapper[4867]: I1201 09:30:34.016929 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:30:34 crc kubenswrapper[4867]: I1201 09:30:34.278004 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerStarted","Data":"12512a850b5fb7de2678a4c4efab8b948c612e499b80d6edfc0b840f3a493348"} Dec 01 09:30:34 crc kubenswrapper[4867]: I1201 09:30:34.839407 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eafa378-9a4a-4028-a0cb-fbb55fa15972" path="/var/lib/kubelet/pods/2eafa378-9a4a-4028-a0cb-fbb55fa15972/volumes" Dec 01 09:30:35 crc kubenswrapper[4867]: I1201 09:30:35.286517 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerStarted","Data":"895ef1adb7e8bc3d03e00872fd06677d0f7848708942d213c579c8a52d00cc8c"} Dec 01 09:30:36 crc kubenswrapper[4867]: I1201 09:30:36.297743 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerStarted","Data":"c52a5c38b8fe8e640e01c44261b96866a13accede07427868d207ee7f5dd558c"} Dec 01 09:30:37 crc kubenswrapper[4867]: I1201 09:30:37.310378 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerStarted","Data":"c7d59fd11a3156c1bdb110ab36ee44302791a5cf1935ce3d87149c815c210ecc"} Dec 01 09:30:40 crc kubenswrapper[4867]: I1201 09:30:40.338397 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerStarted","Data":"4d0a3f4565639e4c78fcdc0eb5f0d0c2566e286e8839e45852bc8a7244f05795"} Dec 01 09:30:40 crc kubenswrapper[4867]: I1201 09:30:40.339007 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:30:40 crc kubenswrapper[4867]: I1201 09:30:40.338656 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="ceilometer-central-agent" containerID="cri-o://895ef1adb7e8bc3d03e00872fd06677d0f7848708942d213c579c8a52d00cc8c" gracePeriod=30 Dec 01 09:30:40 crc kubenswrapper[4867]: I1201 09:30:40.339130 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="sg-core" containerID="cri-o://c7d59fd11a3156c1bdb110ab36ee44302791a5cf1935ce3d87149c815c210ecc" gracePeriod=30 Dec 01 09:30:40 crc kubenswrapper[4867]: I1201 09:30:40.339173 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="ceilometer-notification-agent" containerID="cri-o://c52a5c38b8fe8e640e01c44261b96866a13accede07427868d207ee7f5dd558c" gracePeriod=30 Dec 01 09:30:40 crc kubenswrapper[4867]: I1201 09:30:40.339196 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="proxy-httpd" containerID="cri-o://4d0a3f4565639e4c78fcdc0eb5f0d0c2566e286e8839e45852bc8a7244f05795" gracePeriod=30 Dec 01 09:30:40 crc kubenswrapper[4867]: I1201 09:30:40.373657 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.384604058 podStartE2EDuration="7.373630605s" podCreationTimestamp="2025-12-01 09:30:33 +0000 UTC" 
firstStartedPulling="2025-12-01 09:30:34.016585531 +0000 UTC m=+1355.475972285" lastFinishedPulling="2025-12-01 09:30:39.005612078 +0000 UTC m=+1360.464998832" observedRunningTime="2025-12-01 09:30:40.372964767 +0000 UTC m=+1361.832351521" watchObservedRunningTime="2025-12-01 09:30:40.373630605 +0000 UTC m=+1361.833017379" Dec 01 09:30:41 crc kubenswrapper[4867]: I1201 09:30:41.350194 4867 generic.go:334] "Generic (PLEG): container finished" podID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerID="4d0a3f4565639e4c78fcdc0eb5f0d0c2566e286e8839e45852bc8a7244f05795" exitCode=0 Dec 01 09:30:41 crc kubenswrapper[4867]: I1201 09:30:41.350224 4867 generic.go:334] "Generic (PLEG): container finished" podID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerID="c7d59fd11a3156c1bdb110ab36ee44302791a5cf1935ce3d87149c815c210ecc" exitCode=2 Dec 01 09:30:41 crc kubenswrapper[4867]: I1201 09:30:41.350252 4867 generic.go:334] "Generic (PLEG): container finished" podID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerID="c52a5c38b8fe8e640e01c44261b96866a13accede07427868d207ee7f5dd558c" exitCode=0 Dec 01 09:30:41 crc kubenswrapper[4867]: I1201 09:30:41.350274 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerDied","Data":"4d0a3f4565639e4c78fcdc0eb5f0d0c2566e286e8839e45852bc8a7244f05795"} Dec 01 09:30:41 crc kubenswrapper[4867]: I1201 09:30:41.350332 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerDied","Data":"c7d59fd11a3156c1bdb110ab36ee44302791a5cf1935ce3d87149c815c210ecc"} Dec 01 09:30:41 crc kubenswrapper[4867]: I1201 09:30:41.350353 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerDied","Data":"c52a5c38b8fe8e640e01c44261b96866a13accede07427868d207ee7f5dd558c"} Dec 01 09:30:43 
crc kubenswrapper[4867]: I1201 09:30:43.382011 4867 generic.go:334] "Generic (PLEG): container finished" podID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerID="895ef1adb7e8bc3d03e00872fd06677d0f7848708942d213c579c8a52d00cc8c" exitCode=0 Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.382484 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerDied","Data":"895ef1adb7e8bc3d03e00872fd06677d0f7848708942d213c579c8a52d00cc8c"} Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.640783 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.805830 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-run-httpd\") pod \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.805886 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-scripts\") pod \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.806084 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-sg-core-conf-yaml\") pod \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.806140 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-log-httpd\") 
pod \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.806166 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-config-data\") pod \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.806200 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2d4t\" (UniqueName: \"kubernetes.io/projected/8d58b09d-4905-4deb-9aa4-8537d4ed954f-kube-api-access-q2d4t\") pod \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.806283 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-combined-ca-bundle\") pod \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\" (UID: \"8d58b09d-4905-4deb-9aa4-8537d4ed954f\") " Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.806369 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8d58b09d-4905-4deb-9aa4-8537d4ed954f" (UID: "8d58b09d-4905-4deb-9aa4-8537d4ed954f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.806826 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.807476 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8d58b09d-4905-4deb-9aa4-8537d4ed954f" (UID: "8d58b09d-4905-4deb-9aa4-8537d4ed954f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.823427 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-scripts" (OuterVolumeSpecName: "scripts") pod "8d58b09d-4905-4deb-9aa4-8537d4ed954f" (UID: "8d58b09d-4905-4deb-9aa4-8537d4ed954f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.823919 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d58b09d-4905-4deb-9aa4-8537d4ed954f-kube-api-access-q2d4t" (OuterVolumeSpecName: "kube-api-access-q2d4t") pod "8d58b09d-4905-4deb-9aa4-8537d4ed954f" (UID: "8d58b09d-4905-4deb-9aa4-8537d4ed954f"). InnerVolumeSpecName "kube-api-access-q2d4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.885767 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8d58b09d-4905-4deb-9aa4-8537d4ed954f" (UID: "8d58b09d-4905-4deb-9aa4-8537d4ed954f"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.912010 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.912327 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d58b09d-4905-4deb-9aa4-8537d4ed954f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.912889 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2d4t\" (UniqueName: \"kubernetes.io/projected/8d58b09d-4905-4deb-9aa4-8537d4ed954f-kube-api-access-q2d4t\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.913090 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.919342 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d58b09d-4905-4deb-9aa4-8537d4ed954f" (UID: "8d58b09d-4905-4deb-9aa4-8537d4ed954f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:43 crc kubenswrapper[4867]: I1201 09:30:43.942360 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-config-data" (OuterVolumeSpecName: "config-data") pod "8d58b09d-4905-4deb-9aa4-8537d4ed954f" (UID: "8d58b09d-4905-4deb-9aa4-8537d4ed954f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.015404 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.015446 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d58b09d-4905-4deb-9aa4-8537d4ed954f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.393500 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d58b09d-4905-4deb-9aa4-8537d4ed954f","Type":"ContainerDied","Data":"12512a850b5fb7de2678a4c4efab8b948c612e499b80d6edfc0b840f3a493348"} Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.393556 4867 scope.go:117] "RemoveContainer" containerID="4d0a3f4565639e4c78fcdc0eb5f0d0c2566e286e8839e45852bc8a7244f05795" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.393578 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.430678 4867 scope.go:117] "RemoveContainer" containerID="c7d59fd11a3156c1bdb110ab36ee44302791a5cf1935ce3d87149c815c210ecc" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.434945 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.448762 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.453021 4867 scope.go:117] "RemoveContainer" containerID="c52a5c38b8fe8e640e01c44261b96866a13accede07427868d207ee7f5dd558c" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.466653 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:44 crc kubenswrapper[4867]: E1201 09:30:44.467290 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="sg-core" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.467399 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="sg-core" Dec 01 09:30:44 crc kubenswrapper[4867]: E1201 09:30:44.467491 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="ceilometer-notification-agent" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.467559 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="ceilometer-notification-agent" Dec 01 09:30:44 crc kubenswrapper[4867]: E1201 09:30:44.467626 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="ceilometer-central-agent" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.467704 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="ceilometer-central-agent" Dec 01 09:30:44 crc kubenswrapper[4867]: E1201 09:30:44.467778 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="proxy-httpd" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.467873 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="proxy-httpd" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.468244 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="ceilometer-notification-agent" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.468349 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="sg-core" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.468428 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="proxy-httpd" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.468517 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" containerName="ceilometer-central-agent" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.472843 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.477404 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.477752 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.480870 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.509285 4867 scope.go:117] "RemoveContainer" containerID="895ef1adb7e8bc3d03e00872fd06677d0f7848708942d213c579c8a52d00cc8c" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.625027 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-config-data\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.625398 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-log-httpd\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.625523 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.625622 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.625723 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-scripts\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.625844 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-run-httpd\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.625959 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbm9b\" (UniqueName: \"kubernetes.io/projected/ed0ac456-bbf3-4073-96be-6469b1f25a4c-kube-api-access-cbm9b\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.727888 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-config-data\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.727982 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-log-httpd\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " 
pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.728007 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.728034 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.728052 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-scripts\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.728074 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-run-httpd\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.728103 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbm9b\" (UniqueName: \"kubernetes.io/projected/ed0ac456-bbf3-4073-96be-6469b1f25a4c-kube-api-access-cbm9b\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.729518 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-log-httpd\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.729670 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-run-httpd\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.734067 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.735396 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-config-data\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.736588 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-scripts\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.753977 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.755274 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cbm9b\" (UniqueName: \"kubernetes.io/projected/ed0ac456-bbf3-4073-96be-6469b1f25a4c-kube-api-access-cbm9b\") pod \"ceilometer-0\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.826597 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:44 crc kubenswrapper[4867]: I1201 09:30:44.844293 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d58b09d-4905-4deb-9aa4-8537d4ed954f" path="/var/lib/kubelet/pods/8d58b09d-4905-4deb-9aa4-8537d4ed954f/volumes" Dec 01 09:30:45 crc kubenswrapper[4867]: I1201 09:30:45.378415 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:45 crc kubenswrapper[4867]: I1201 09:30:45.429573 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerStarted","Data":"eca7e389536b548d48fa60663354673147d4830111a144a92e865dab88c332c5"} Dec 01 09:30:45 crc kubenswrapper[4867]: I1201 09:30:45.922905 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 09:30:45 crc kubenswrapper[4867]: I1201 09:30:45.923789 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="94647b4f-f18c-4010-8573-b36075f21ecc" containerName="glance-log" containerID="cri-o://c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7" gracePeriod=30 Dec 01 09:30:45 crc kubenswrapper[4867]: I1201 09:30:45.923839 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="94647b4f-f18c-4010-8573-b36075f21ecc" containerName="glance-httpd" 
containerID="cri-o://e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48" gracePeriod=30 Dec 01 09:30:46 crc kubenswrapper[4867]: I1201 09:30:46.444072 4867 generic.go:334] "Generic (PLEG): container finished" podID="94647b4f-f18c-4010-8573-b36075f21ecc" containerID="c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7" exitCode=143 Dec 01 09:30:46 crc kubenswrapper[4867]: I1201 09:30:46.444345 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94647b4f-f18c-4010-8573-b36075f21ecc","Type":"ContainerDied","Data":"c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7"} Dec 01 09:30:46 crc kubenswrapper[4867]: I1201 09:30:46.445589 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerStarted","Data":"e5d50f1aa0c7a81125074ee55036cc118072bd9e9e54bcd2e9ffed3aba066a77"} Dec 01 09:30:47 crc kubenswrapper[4867]: I1201 09:30:47.093306 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:30:47 crc kubenswrapper[4867]: I1201 09:30:47.093935 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-log" containerID="cri-o://be9c99f1cc0cad5bda870353e0af813dd03a1eacd33fa1fe8d450a2d291b1798" gracePeriod=30 Dec 01 09:30:47 crc kubenswrapper[4867]: I1201 09:30:47.094511 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-httpd" containerID="cri-o://86061cced9fc07099d9d68d5d3403011f3b13c3c5b735f6f29f779c23f098583" gracePeriod=30 Dec 01 09:30:47 crc kubenswrapper[4867]: I1201 09:30:47.456903 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerID="be9c99f1cc0cad5bda870353e0af813dd03a1eacd33fa1fe8d450a2d291b1798" exitCode=143 Dec 01 09:30:47 crc kubenswrapper[4867]: I1201 09:30:47.456980 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1179a242-dbf2-4bc1-888b-33f22df356a6","Type":"ContainerDied","Data":"be9c99f1cc0cad5bda870353e0af813dd03a1eacd33fa1fe8d450a2d291b1798"} Dec 01 09:30:47 crc kubenswrapper[4867]: I1201 09:30:47.459250 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerStarted","Data":"ad2c27abcf450a41455c44a175317eba22ee4d000930026bd261c96b116c861d"} Dec 01 09:30:47 crc kubenswrapper[4867]: I1201 09:30:47.461129 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" event={"ID":"e83471d7-4d9d-427c-b769-bd072acbaae0","Type":"ContainerStarted","Data":"e2f8e9d387cdafacb10858ccb0a575eab5feec4f6cad38bf28df16f55c143b64"} Dec 01 09:30:47 crc kubenswrapper[4867]: I1201 09:30:47.486102 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" podStartSLOduration=2.517405305 podStartE2EDuration="35.486083212s" podCreationTimestamp="2025-12-01 09:30:12 +0000 UTC" firstStartedPulling="2025-12-01 09:30:13.352384792 +0000 UTC m=+1334.811771546" lastFinishedPulling="2025-12-01 09:30:46.321062699 +0000 UTC m=+1367.780449453" observedRunningTime="2025-12-01 09:30:47.483340146 +0000 UTC m=+1368.942726900" watchObservedRunningTime="2025-12-01 09:30:47.486083212 +0000 UTC m=+1368.945469956" Dec 01 09:30:47 crc kubenswrapper[4867]: I1201 09:30:47.722929 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:48 crc kubenswrapper[4867]: I1201 09:30:48.471458 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerStarted","Data":"d6567553e8085d8f8993a5e3a9998ec35d1a91810c105af6be9c67718f31a16c"} Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.134159 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.226770 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-httpd-run\") pod \"94647b4f-f18c-4010-8573-b36075f21ecc\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.227659 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "94647b4f-f18c-4010-8573-b36075f21ecc" (UID: "94647b4f-f18c-4010-8573-b36075f21ecc"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.227715 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"94647b4f-f18c-4010-8573-b36075f21ecc\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.227753 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-logs\") pod \"94647b4f-f18c-4010-8573-b36075f21ecc\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.227799 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-scripts\") pod \"94647b4f-f18c-4010-8573-b36075f21ecc\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.227859 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-combined-ca-bundle\") pod \"94647b4f-f18c-4010-8573-b36075f21ecc\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.227904 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-config-data\") pod \"94647b4f-f18c-4010-8573-b36075f21ecc\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.227988 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q5tf\" (UniqueName: 
\"kubernetes.io/projected/94647b4f-f18c-4010-8573-b36075f21ecc-kube-api-access-8q5tf\") pod \"94647b4f-f18c-4010-8573-b36075f21ecc\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.228028 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-public-tls-certs\") pod \"94647b4f-f18c-4010-8573-b36075f21ecc\" (UID: \"94647b4f-f18c-4010-8573-b36075f21ecc\") " Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.228829 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-logs" (OuterVolumeSpecName: "logs") pod "94647b4f-f18c-4010-8573-b36075f21ecc" (UID: "94647b4f-f18c-4010-8573-b36075f21ecc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.229209 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.229228 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94647b4f-f18c-4010-8573-b36075f21ecc-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.249490 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "94647b4f-f18c-4010-8573-b36075f21ecc" (UID: "94647b4f-f18c-4010-8573-b36075f21ecc"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.254015 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94647b4f-f18c-4010-8573-b36075f21ecc-kube-api-access-8q5tf" (OuterVolumeSpecName: "kube-api-access-8q5tf") pod "94647b4f-f18c-4010-8573-b36075f21ecc" (UID: "94647b4f-f18c-4010-8573-b36075f21ecc"). InnerVolumeSpecName "kube-api-access-8q5tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.254566 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-scripts" (OuterVolumeSpecName: "scripts") pod "94647b4f-f18c-4010-8573-b36075f21ecc" (UID: "94647b4f-f18c-4010-8573-b36075f21ecc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.299603 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94647b4f-f18c-4010-8573-b36075f21ecc" (UID: "94647b4f-f18c-4010-8573-b36075f21ecc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.329683 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-config-data" (OuterVolumeSpecName: "config-data") pod "94647b4f-f18c-4010-8573-b36075f21ecc" (UID: "94647b4f-f18c-4010-8573-b36075f21ecc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.331184 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.331224 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.331237 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.331254 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.331267 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q5tf\" (UniqueName: \"kubernetes.io/projected/94647b4f-f18c-4010-8573-b36075f21ecc-kube-api-access-8q5tf\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.350960 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "94647b4f-f18c-4010-8573-b36075f21ecc" (UID: "94647b4f-f18c-4010-8573-b36075f21ecc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.358844 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.147:9292/healthcheck\": read tcp 10.217.0.2:51704->10.217.0.147:9292: read: connection reset by peer" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.358844 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.147:9292/healthcheck\": read tcp 10.217.0.2:51706->10.217.0.147:9292: read: connection reset by peer" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.377892 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.433207 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.433238 4867 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94647b4f-f18c-4010-8573-b36075f21ecc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.511367 4867 generic.go:334] "Generic (PLEG): container finished" podID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerID="86061cced9fc07099d9d68d5d3403011f3b13c3c5b735f6f29f779c23f098583" exitCode=0 Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.511517 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"1179a242-dbf2-4bc1-888b-33f22df356a6","Type":"ContainerDied","Data":"86061cced9fc07099d9d68d5d3403011f3b13c3c5b735f6f29f779c23f098583"} Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.514578 4867 generic.go:334] "Generic (PLEG): container finished" podID="94647b4f-f18c-4010-8573-b36075f21ecc" containerID="e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48" exitCode=0 Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.514991 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94647b4f-f18c-4010-8573-b36075f21ecc","Type":"ContainerDied","Data":"e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48"} Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.515095 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94647b4f-f18c-4010-8573-b36075f21ecc","Type":"ContainerDied","Data":"7ae1dee8772bc91bb3849fd8f65efd07d7b99781063fc5fe55ca9ed2a7cdee89"} Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.515182 4867 scope.go:117] "RemoveContainer" containerID="e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.515412 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.526313 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerStarted","Data":"4d9fd4d7d7cf1b4924c49d0b950c09ea2870da6d7fc4ada0e01b805a0ed0ba68"} Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.527027 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="ceilometer-central-agent" containerID="cri-o://e5d50f1aa0c7a81125074ee55036cc118072bd9e9e54bcd2e9ffed3aba066a77" gracePeriod=30 Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.527236 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="proxy-httpd" containerID="cri-o://4d9fd4d7d7cf1b4924c49d0b950c09ea2870da6d7fc4ada0e01b805a0ed0ba68" gracePeriod=30 Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.527318 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="sg-core" containerID="cri-o://d6567553e8085d8f8993a5e3a9998ec35d1a91810c105af6be9c67718f31a16c" gracePeriod=30 Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.527379 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="ceilometer-notification-agent" containerID="cri-o://ad2c27abcf450a41455c44a175317eba22ee4d000930026bd261c96b116c861d" gracePeriod=30 Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.527386 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.559171 4867 
scope.go:117] "RemoveContainer" containerID="c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.568780 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.45683529 podStartE2EDuration="6.568757854s" podCreationTimestamp="2025-12-01 09:30:44 +0000 UTC" firstStartedPulling="2025-12-01 09:30:45.419782895 +0000 UTC m=+1366.879169649" lastFinishedPulling="2025-12-01 09:30:49.531705459 +0000 UTC m=+1370.991092213" observedRunningTime="2025-12-01 09:30:50.559579022 +0000 UTC m=+1372.018965776" watchObservedRunningTime="2025-12-01 09:30:50.568757854 +0000 UTC m=+1372.028144598" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.591514 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.608971 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.615357 4867 scope.go:117] "RemoveContainer" containerID="e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48" Dec 01 09:30:50 crc kubenswrapper[4867]: E1201 09:30:50.617784 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48\": container with ID starting with e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48 not found: ID does not exist" containerID="e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.617926 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48"} err="failed to get container status 
\"e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48\": rpc error: code = NotFound desc = could not find container \"e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48\": container with ID starting with e2e2a867e1f7c3c2c7373d2c831ddca3c8248d74e8fb2a26196b9274e44e5b48 not found: ID does not exist" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.618015 4867 scope.go:117] "RemoveContainer" containerID="c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.618849 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 09:30:50 crc kubenswrapper[4867]: E1201 09:30:50.619219 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94647b4f-f18c-4010-8573-b36075f21ecc" containerName="glance-log" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.619236 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="94647b4f-f18c-4010-8573-b36075f21ecc" containerName="glance-log" Dec 01 09:30:50 crc kubenswrapper[4867]: E1201 09:30:50.619273 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94647b4f-f18c-4010-8573-b36075f21ecc" containerName="glance-httpd" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.619280 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="94647b4f-f18c-4010-8573-b36075f21ecc" containerName="glance-httpd" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.619495 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="94647b4f-f18c-4010-8573-b36075f21ecc" containerName="glance-httpd" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.619514 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="94647b4f-f18c-4010-8573-b36075f21ecc" containerName="glance-log" Dec 01 09:30:50 crc kubenswrapper[4867]: E1201 09:30:50.620615 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7\": container with ID starting with c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7 not found: ID does not exist" containerID="c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.620686 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7"} err="failed to get container status \"c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7\": rpc error: code = NotFound desc = could not find container \"c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7\": container with ID starting with c83a226124ce3e9eb3d129b3bc5a6eec55ad521347829f347ddc31dbabf161a7 not found: ID does not exist" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.621044 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.626755 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.626968 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.646847 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.744993 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.745061 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.745120 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.745160 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1b458da1-78cd-4603-936e-e60b83594fad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.745192 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.745207 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl5pm\" (UniqueName: \"kubernetes.io/projected/1b458da1-78cd-4603-936e-e60b83594fad-kube-api-access-tl5pm\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.745251 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.745268 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b458da1-78cd-4603-936e-e60b83594fad-logs\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.846491 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.846897 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b458da1-78cd-4603-936e-e60b83594fad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.846953 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.846975 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl5pm\" (UniqueName: \"kubernetes.io/projected/1b458da1-78cd-4603-936e-e60b83594fad-kube-api-access-tl5pm\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.847044 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.847067 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b458da1-78cd-4603-936e-e60b83594fad-logs\") pod 
\"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.847105 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.847151 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.848249 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.848314 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94647b4f-f18c-4010-8573-b36075f21ecc" path="/var/lib/kubelet/pods/94647b4f-f18c-4010-8573-b36075f21ecc/volumes" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.848773 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b458da1-78cd-4603-936e-e60b83594fad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.849119 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b458da1-78cd-4603-936e-e60b83594fad-logs\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.855549 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.856470 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.863885 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.867744 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b458da1-78cd-4603-936e-e60b83594fad-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.871761 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl5pm\" (UniqueName: 
\"kubernetes.io/projected/1b458da1-78cd-4603-936e-e60b83594fad-kube-api-access-tl5pm\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:50 crc kubenswrapper[4867]: I1201 09:30:50.927939 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1b458da1-78cd-4603-936e-e60b83594fad\") " pod="openstack/glance-default-external-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.012747 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.019222 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.175132 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-httpd-run\") pod \"1179a242-dbf2-4bc1-888b-33f22df356a6\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.175616 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-combined-ca-bundle\") pod \"1179a242-dbf2-4bc1-888b-33f22df356a6\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.175646 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-scripts\") pod \"1179a242-dbf2-4bc1-888b-33f22df356a6\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " 
Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.176102 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1179a242-dbf2-4bc1-888b-33f22df356a6" (UID: "1179a242-dbf2-4bc1-888b-33f22df356a6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.176281 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1179a242-dbf2-4bc1-888b-33f22df356a6\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.176370 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-internal-tls-certs\") pod \"1179a242-dbf2-4bc1-888b-33f22df356a6\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.176723 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-logs\") pod \"1179a242-dbf2-4bc1-888b-33f22df356a6\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.177648 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxwc4\" (UniqueName: \"kubernetes.io/projected/1179a242-dbf2-4bc1-888b-33f22df356a6-kube-api-access-nxwc4\") pod \"1179a242-dbf2-4bc1-888b-33f22df356a6\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.177717 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-config-data\") pod \"1179a242-dbf2-4bc1-888b-33f22df356a6\" (UID: \"1179a242-dbf2-4bc1-888b-33f22df356a6\") " Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.178674 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.179568 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-logs" (OuterVolumeSpecName: "logs") pod "1179a242-dbf2-4bc1-888b-33f22df356a6" (UID: "1179a242-dbf2-4bc1-888b-33f22df356a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.184897 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1179a242-dbf2-4bc1-888b-33f22df356a6-kube-api-access-nxwc4" (OuterVolumeSpecName: "kube-api-access-nxwc4") pod "1179a242-dbf2-4bc1-888b-33f22df356a6" (UID: "1179a242-dbf2-4bc1-888b-33f22df356a6"). InnerVolumeSpecName "kube-api-access-nxwc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.189945 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "1179a242-dbf2-4bc1-888b-33f22df356a6" (UID: "1179a242-dbf2-4bc1-888b-33f22df356a6"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.202003 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-scripts" (OuterVolumeSpecName: "scripts") pod "1179a242-dbf2-4bc1-888b-33f22df356a6" (UID: "1179a242-dbf2-4bc1-888b-33f22df356a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.210031 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1179a242-dbf2-4bc1-888b-33f22df356a6" (UID: "1179a242-dbf2-4bc1-888b-33f22df356a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.281254 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.281296 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.281332 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.281347 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179a242-dbf2-4bc1-888b-33f22df356a6-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 
09:30:51.281359 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxwc4\" (UniqueName: \"kubernetes.io/projected/1179a242-dbf2-4bc1-888b-33f22df356a6-kube-api-access-nxwc4\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.410145 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-config-data" (OuterVolumeSpecName: "config-data") pod "1179a242-dbf2-4bc1-888b-33f22df356a6" (UID: "1179a242-dbf2-4bc1-888b-33f22df356a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.419066 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.425924 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1179a242-dbf2-4bc1-888b-33f22df356a6" (UID: "1179a242-dbf2-4bc1-888b-33f22df356a6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.485257 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.485290 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.485302 4867 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179a242-dbf2-4bc1-888b-33f22df356a6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.540331 4867 generic.go:334] "Generic (PLEG): container finished" podID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerID="4d9fd4d7d7cf1b4924c49d0b950c09ea2870da6d7fc4ada0e01b805a0ed0ba68" exitCode=0 Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.540588 4867 generic.go:334] "Generic (PLEG): container finished" podID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerID="d6567553e8085d8f8993a5e3a9998ec35d1a91810c105af6be9c67718f31a16c" exitCode=2 Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.540680 4867 generic.go:334] "Generic (PLEG): container finished" podID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerID="ad2c27abcf450a41455c44a175317eba22ee4d000930026bd261c96b116c861d" exitCode=0 Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.540827 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerDied","Data":"4d9fd4d7d7cf1b4924c49d0b950c09ea2870da6d7fc4ada0e01b805a0ed0ba68"} Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.540948 4867 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerDied","Data":"d6567553e8085d8f8993a5e3a9998ec35d1a91810c105af6be9c67718f31a16c"} Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.541038 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerDied","Data":"ad2c27abcf450a41455c44a175317eba22ee4d000930026bd261c96b116c861d"} Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.543786 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1179a242-dbf2-4bc1-888b-33f22df356a6","Type":"ContainerDied","Data":"6c5abe63eb3b8ad76da57393c4e4eaade98c513750a7bc5630ca5bbcfd323d8f"} Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.543986 4867 scope.go:117] "RemoveContainer" containerID="86061cced9fc07099d9d68d5d3403011f3b13c3c5b735f6f29f779c23f098583" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.544207 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.545356 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.587295 4867 scope.go:117] "RemoveContainer" containerID="be9c99f1cc0cad5bda870353e0af813dd03a1eacd33fa1fe8d450a2d291b1798" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.631828 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.643131 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.669372 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:30:51 crc kubenswrapper[4867]: E1201 09:30:51.669834 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-httpd" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.669855 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-httpd" Dec 01 09:30:51 crc kubenswrapper[4867]: E1201 09:30:51.669890 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-log" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.669898 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-log" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.670117 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-httpd" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.670145 4867 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" containerName="glance-log" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.671285 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.675130 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.679642 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.724023 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.797590 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2c5\" (UniqueName: \"kubernetes.io/projected/5f529607-d9e3-4605-8428-5903a9bab379-kube-api-access-mr2c5\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.797625 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.797643 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.797664 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.797730 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.797748 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f529607-d9e3-4605-8428-5903a9bab379-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.797834 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.797856 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f529607-d9e3-4605-8428-5903a9bab379-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.901952 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.902009 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f529607-d9e3-4605-8428-5903a9bab379-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.902074 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.902100 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr2c5\" (UniqueName: \"kubernetes.io/projected/5f529607-d9e3-4605-8428-5903a9bab379-kube-api-access-mr2c5\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.902123 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 
crc kubenswrapper[4867]: I1201 09:30:51.902148 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.902229 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.902260 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f529607-d9e3-4605-8428-5903a9bab379-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.902860 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f529607-d9e3-4605-8428-5903a9bab379-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.905444 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f529607-d9e3-4605-8428-5903a9bab379-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.910649 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: E1201 09:30:51.912056 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1179a242_dbf2_4bc1_888b_33f22df356a6.slice/crio-6c5abe63eb3b8ad76da57393c4e4eaade98c513750a7bc5630ca5bbcfd323d8f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1179a242_dbf2_4bc1_888b_33f22df356a6.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.931077 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.939680 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.940283 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc 
kubenswrapper[4867]: I1201 09:30:51.940470 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f529607-d9e3-4605-8428-5903a9bab379-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:51 crc kubenswrapper[4867]: I1201 09:30:51.940895 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr2c5\" (UniqueName: \"kubernetes.io/projected/5f529607-d9e3-4605-8428-5903a9bab379-kube-api-access-mr2c5\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:52 crc kubenswrapper[4867]: I1201 09:30:52.060740 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f529607-d9e3-4605-8428-5903a9bab379\") " pod="openstack/glance-default-internal-api-0" Dec 01 09:30:52 crc kubenswrapper[4867]: I1201 09:30:52.312740 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 09:30:52 crc kubenswrapper[4867]: I1201 09:30:52.575974 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b458da1-78cd-4603-936e-e60b83594fad","Type":"ContainerStarted","Data":"bd019b1f2218ee1232366cf481e502b2ace39eb7e973c53b9803d51b7aaf8c28"} Dec 01 09:30:52 crc kubenswrapper[4867]: I1201 09:30:52.576281 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b458da1-78cd-4603-936e-e60b83594fad","Type":"ContainerStarted","Data":"bba344708b18668f9f73abbaed7f4a31f7b1d2e32ec4600bbb7b0e14d297a1bb"} Dec 01 09:30:52 crc kubenswrapper[4867]: I1201 09:30:52.844960 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1179a242-dbf2-4bc1-888b-33f22df356a6" path="/var/lib/kubelet/pods/1179a242-dbf2-4bc1-888b-33f22df356a6/volumes" Dec 01 09:30:53 crc kubenswrapper[4867]: I1201 09:30:53.013373 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 09:30:53 crc kubenswrapper[4867]: W1201 09:30:53.023704 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f529607_d9e3_4605_8428_5903a9bab379.slice/crio-192733ac8fc0b47bbddeb8f30a4d7590f86fb13b36beca44e6b8db2c8a0ba095 WatchSource:0}: Error finding container 192733ac8fc0b47bbddeb8f30a4d7590f86fb13b36beca44e6b8db2c8a0ba095: Status 404 returned error can't find the container with id 192733ac8fc0b47bbddeb8f30a4d7590f86fb13b36beca44e6b8db2c8a0ba095 Dec 01 09:30:53 crc kubenswrapper[4867]: I1201 09:30:53.629236 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b458da1-78cd-4603-936e-e60b83594fad","Type":"ContainerStarted","Data":"7c1a7a0d5ef66ac70135c6228095ecadeb4ab05e7d6abc6a8f4a632c5df75fb7"} Dec 01 09:30:53 crc 
kubenswrapper[4867]: I1201 09:30:53.632076 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f529607-d9e3-4605-8428-5903a9bab379","Type":"ContainerStarted","Data":"192733ac8fc0b47bbddeb8f30a4d7590f86fb13b36beca44e6b8db2c8a0ba095"} Dec 01 09:30:53 crc kubenswrapper[4867]: I1201 09:30:53.676137 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.676116495 podStartE2EDuration="3.676116495s" podCreationTimestamp="2025-12-01 09:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:53.668314562 +0000 UTC m=+1375.127701326" watchObservedRunningTime="2025-12-01 09:30:53.676116495 +0000 UTC m=+1375.135503249" Dec 01 09:30:54 crc kubenswrapper[4867]: I1201 09:30:54.647432 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f529607-d9e3-4605-8428-5903a9bab379","Type":"ContainerStarted","Data":"11e3e74264fc9f8b7fb134d6b2d33e19aa7d1890fddcde8d0d943893a42a2723"} Dec 01 09:30:54 crc kubenswrapper[4867]: I1201 09:30:54.647715 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f529607-d9e3-4605-8428-5903a9bab379","Type":"ContainerStarted","Data":"4a1c8e648b058b5cf33f64a6138389fee1fb6381afcca784d8de9e641eec064d"} Dec 01 09:30:54 crc kubenswrapper[4867]: I1201 09:30:54.669315 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.669296295 podStartE2EDuration="3.669296295s" podCreationTimestamp="2025-12-01 09:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:54.666334064 +0000 UTC m=+1376.125720818" 
watchObservedRunningTime="2025-12-01 09:30:54.669296295 +0000 UTC m=+1376.128683049" Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.665315 4867 generic.go:334] "Generic (PLEG): container finished" podID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerID="e5d50f1aa0c7a81125074ee55036cc118072bd9e9e54bcd2e9ffed3aba066a77" exitCode=0 Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.665497 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerDied","Data":"e5d50f1aa0c7a81125074ee55036cc118072bd9e9e54bcd2e9ffed3aba066a77"} Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.747761 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.919318 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-config-data\") pod \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.919429 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-scripts\") pod \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.919481 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-run-httpd\") pod \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.919522 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-cbm9b\" (UniqueName: \"kubernetes.io/projected/ed0ac456-bbf3-4073-96be-6469b1f25a4c-kube-api-access-cbm9b\") pod \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.919613 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-sg-core-conf-yaml\") pod \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.919661 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-combined-ca-bundle\") pod \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.920040 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed0ac456-bbf3-4073-96be-6469b1f25a4c" (UID: "ed0ac456-bbf3-4073-96be-6469b1f25a4c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.920390 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-log-httpd\") pod \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\" (UID: \"ed0ac456-bbf3-4073-96be-6469b1f25a4c\") " Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.920638 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed0ac456-bbf3-4073-96be-6469b1f25a4c" (UID: "ed0ac456-bbf3-4073-96be-6469b1f25a4c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.921603 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.921619 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed0ac456-bbf3-4073-96be-6469b1f25a4c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.926452 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-scripts" (OuterVolumeSpecName: "scripts") pod "ed0ac456-bbf3-4073-96be-6469b1f25a4c" (UID: "ed0ac456-bbf3-4073-96be-6469b1f25a4c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.927591 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0ac456-bbf3-4073-96be-6469b1f25a4c-kube-api-access-cbm9b" (OuterVolumeSpecName: "kube-api-access-cbm9b") pod "ed0ac456-bbf3-4073-96be-6469b1f25a4c" (UID: "ed0ac456-bbf3-4073-96be-6469b1f25a4c"). InnerVolumeSpecName "kube-api-access-cbm9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.961753 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed0ac456-bbf3-4073-96be-6469b1f25a4c" (UID: "ed0ac456-bbf3-4073-96be-6469b1f25a4c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:55 crc kubenswrapper[4867]: I1201 09:30:55.999738 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed0ac456-bbf3-4073-96be-6469b1f25a4c" (UID: "ed0ac456-bbf3-4073-96be-6469b1f25a4c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.026880 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.026930 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbm9b\" (UniqueName: \"kubernetes.io/projected/ed0ac456-bbf3-4073-96be-6469b1f25a4c-kube-api-access-cbm9b\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.026945 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.026959 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.047227 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-config-data" (OuterVolumeSpecName: "config-data") pod "ed0ac456-bbf3-4073-96be-6469b1f25a4c" (UID: "ed0ac456-bbf3-4073-96be-6469b1f25a4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.128376 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ac456-bbf3-4073-96be-6469b1f25a4c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.679947 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed0ac456-bbf3-4073-96be-6469b1f25a4c","Type":"ContainerDied","Data":"eca7e389536b548d48fa60663354673147d4830111a144a92e865dab88c332c5"} Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.680009 4867 scope.go:117] "RemoveContainer" containerID="4d9fd4d7d7cf1b4924c49d0b950c09ea2870da6d7fc4ada0e01b805a0ed0ba68" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.680222 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.713017 4867 scope.go:117] "RemoveContainer" containerID="d6567553e8085d8f8993a5e3a9998ec35d1a91810c105af6be9c67718f31a16c" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.721411 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.740342 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.750003 4867 scope.go:117] "RemoveContainer" containerID="ad2c27abcf450a41455c44a175317eba22ee4d000930026bd261c96b116c861d" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.753186 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:56 crc kubenswrapper[4867]: E1201 09:30:56.753597 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="sg-core" Dec 01 09:30:56 crc kubenswrapper[4867]: 
I1201 09:30:56.753621 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="sg-core" Dec 01 09:30:56 crc kubenswrapper[4867]: E1201 09:30:56.753660 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="proxy-httpd" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.753669 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="proxy-httpd" Dec 01 09:30:56 crc kubenswrapper[4867]: E1201 09:30:56.753694 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="ceilometer-notification-agent" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.753702 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="ceilometer-notification-agent" Dec 01 09:30:56 crc kubenswrapper[4867]: E1201 09:30:56.753721 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="ceilometer-central-agent" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.753730 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="ceilometer-central-agent" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.753988 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="sg-core" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.754018 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="ceilometer-notification-agent" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.754040 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="proxy-httpd" Dec 01 09:30:56 
crc kubenswrapper[4867]: I1201 09:30:56.754054 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" containerName="ceilometer-central-agent" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.757505 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.762139 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.774681 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.781776 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.795373 4867 scope.go:117] "RemoveContainer" containerID="e5d50f1aa0c7a81125074ee55036cc118072bd9e9e54bcd2e9ffed3aba066a77" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.838856 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0ac456-bbf3-4073-96be-6469b1f25a4c" path="/var/lib/kubelet/pods/ed0ac456-bbf3-4073-96be-6469b1f25a4c/volumes" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.854500 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-log-httpd\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.854560 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-run-httpd\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " 
pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.854590 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.854689 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl9hr\" (UniqueName: \"kubernetes.io/projected/1addae43-922a-415a-baa5-b5e8c07c6bc2-kube-api-access-pl9hr\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.854757 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-config-data\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.854789 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.854896 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-scripts\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.956714 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pl9hr\" (UniqueName: \"kubernetes.io/projected/1addae43-922a-415a-baa5-b5e8c07c6bc2-kube-api-access-pl9hr\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.956868 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-config-data\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.956903 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.956945 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-scripts\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.956998 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-log-httpd\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.957032 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-run-httpd\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " 
pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.957053 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.957576 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-log-httpd\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.959035 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-run-httpd\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.966216 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.967906 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-scripts\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.969524 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.974912 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-config-data\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:56 crc kubenswrapper[4867]: I1201 09:30:56.976391 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl9hr\" (UniqueName: \"kubernetes.io/projected/1addae43-922a-415a-baa5-b5e8c07c6bc2-kube-api-access-pl9hr\") pod \"ceilometer-0\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " pod="openstack/ceilometer-0" Dec 01 09:30:57 crc kubenswrapper[4867]: I1201 09:30:57.082718 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:30:57 crc kubenswrapper[4867]: I1201 09:30:57.586187 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:57 crc kubenswrapper[4867]: I1201 09:30:57.690469 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerStarted","Data":"c4b3c419b5cb2b7b4741f61f800607cd1f6a1851758877aa1f4a05f271776fd7"} Dec 01 09:30:59 crc kubenswrapper[4867]: I1201 09:30:59.377519 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:30:59 crc kubenswrapper[4867]: I1201 09:30:59.737851 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerStarted","Data":"86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8"} Dec 01 09:30:59 crc kubenswrapper[4867]: I1201 
09:30:59.738206 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerStarted","Data":"9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539"} Dec 01 09:31:01 crc kubenswrapper[4867]: I1201 09:31:01.014311 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 09:31:01 crc kubenswrapper[4867]: I1201 09:31:01.014684 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 09:31:01 crc kubenswrapper[4867]: I1201 09:31:01.047895 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 09:31:01 crc kubenswrapper[4867]: I1201 09:31:01.056515 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 09:31:01 crc kubenswrapper[4867]: I1201 09:31:01.752942 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 09:31:01 crc kubenswrapper[4867]: I1201 09:31:01.753208 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 09:31:02 crc kubenswrapper[4867]: I1201 09:31:02.313102 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 09:31:02 crc kubenswrapper[4867]: I1201 09:31:02.314230 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 09:31:02 crc kubenswrapper[4867]: I1201 09:31:02.344349 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 09:31:02 crc kubenswrapper[4867]: I1201 09:31:02.379316 4867 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 09:31:02 crc kubenswrapper[4867]: I1201 09:31:02.762127 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerStarted","Data":"68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd"} Dec 01 09:31:02 crc kubenswrapper[4867]: I1201 09:31:02.762711 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 09:31:02 crc kubenswrapper[4867]: I1201 09:31:02.762829 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 09:31:03 crc kubenswrapper[4867]: I1201 09:31:03.773324 4867 generic.go:334] "Generic (PLEG): container finished" podID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerID="c7ec8779b58f97fafd5134c7e65888047b97ae6e37afffbfa008a76f648c7186" exitCode=137 Dec 01 09:31:03 crc kubenswrapper[4867]: I1201 09:31:03.773708 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerDied","Data":"c7ec8779b58f97fafd5134c7e65888047b97ae6e37afffbfa008a76f648c7186"} Dec 01 09:31:03 crc kubenswrapper[4867]: I1201 09:31:03.773749 4867 scope.go:117] "RemoveContainer" containerID="686c5303f0412b7b582b8c491b3e8223fe86fdd2e4836a2991c0f50fae8a3067" Dec 01 09:31:03 crc kubenswrapper[4867]: I1201 09:31:03.778953 4867 generic.go:334] "Generic (PLEG): container finished" podID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerID="1ea30f76a6e3c0a162d6bb9415d525b9c47c036354111f0ab5aa32652d0af895" exitCode=137 Dec 01 09:31:03 crc kubenswrapper[4867]: I1201 09:31:03.779833 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d47c7cb76-srf4p" 
event={"ID":"8bd4fac2-df2c-4aab-bf00-99b54a83ddca","Type":"ContainerDied","Data":"1ea30f76a6e3c0a162d6bb9415d525b9c47c036354111f0ab5aa32652d0af895"} Dec 01 09:31:03 crc kubenswrapper[4867]: I1201 09:31:03.899495 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 09:31:03 crc kubenswrapper[4867]: I1201 09:31:03.899610 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.062899 4867 scope.go:117] "RemoveContainer" containerID="9d03af7b1362790fa6ac6592121987809cbd79e15e394cba2fd458a1d0946120" Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.149836 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.791236 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerStarted","Data":"f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4"} Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.796251 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d47c7cb76-srf4p" event={"ID":"8bd4fac2-df2c-4aab-bf00-99b54a83ddca","Type":"ContainerStarted","Data":"dd6112284bb7d8313dd270af87e2fb7c51f3ce0e4f380b3095595a1a9d42f68e"} Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.799207 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.799237 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.799260 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="proxy-httpd" 
containerID="cri-o://7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4" gracePeriod=30 Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.799263 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="sg-core" containerID="cri-o://68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd" gracePeriod=30 Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.799330 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="ceilometer-notification-agent" containerID="cri-o://86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8" gracePeriod=30 Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.799198 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="ceilometer-central-agent" containerID="cri-o://9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539" gracePeriod=30 Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.799777 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerStarted","Data":"7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4"} Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.799832 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:31:04 crc kubenswrapper[4867]: I1201 09:31:04.861142 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.07629334 podStartE2EDuration="8.861116246s" podCreationTimestamp="2025-12-01 09:30:56 +0000 UTC" firstStartedPulling="2025-12-01 09:30:57.583411395 +0000 UTC m=+1379.042798149" 
lastFinishedPulling="2025-12-01 09:31:03.368234311 +0000 UTC m=+1384.827621055" observedRunningTime="2025-12-01 09:31:04.845806255 +0000 UTC m=+1386.305193009" watchObservedRunningTime="2025-12-01 09:31:04.861116246 +0000 UTC m=+1386.320503000" Dec 01 09:31:05 crc kubenswrapper[4867]: I1201 09:31:05.263381 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 09:31:05 crc kubenswrapper[4867]: I1201 09:31:05.264399 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 09:31:05 crc kubenswrapper[4867]: I1201 09:31:05.812294 4867 generic.go:334] "Generic (PLEG): container finished" podID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerID="7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4" exitCode=0 Dec 01 09:31:05 crc kubenswrapper[4867]: I1201 09:31:05.812324 4867 generic.go:334] "Generic (PLEG): container finished" podID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerID="68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd" exitCode=2 Dec 01 09:31:05 crc kubenswrapper[4867]: I1201 09:31:05.812333 4867 generic.go:334] "Generic (PLEG): container finished" podID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerID="86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8" exitCode=0 Dec 01 09:31:05 crc kubenswrapper[4867]: I1201 09:31:05.812336 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerDied","Data":"7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4"} Dec 01 09:31:05 crc kubenswrapper[4867]: I1201 09:31:05.812407 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerDied","Data":"68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd"} Dec 01 09:31:05 crc 
kubenswrapper[4867]: I1201 09:31:05.812418 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerDied","Data":"86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8"} Dec 01 09:31:07 crc kubenswrapper[4867]: I1201 09:31:07.842984 4867 generic.go:334] "Generic (PLEG): container finished" podID="e83471d7-4d9d-427c-b769-bd072acbaae0" containerID="e2f8e9d387cdafacb10858ccb0a575eab5feec4f6cad38bf28df16f55c143b64" exitCode=0 Dec 01 09:31:07 crc kubenswrapper[4867]: I1201 09:31:07.843976 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" event={"ID":"e83471d7-4d9d-427c-b769-bd072acbaae0","Type":"ContainerDied","Data":"e2f8e9d387cdafacb10858ccb0a575eab5feec4f6cad38bf28df16f55c143b64"} Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.849288 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.861080 4867 generic.go:334] "Generic (PLEG): container finished" podID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerID="9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539" exitCode=0 Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.861348 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.861548 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerDied","Data":"9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539"} Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.861597 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1addae43-922a-415a-baa5-b5e8c07c6bc2","Type":"ContainerDied","Data":"c4b3c419b5cb2b7b4741f61f800607cd1f6a1851758877aa1f4a05f271776fd7"} Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.861621 4867 scope.go:117] "RemoveContainer" containerID="7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4" Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.912157 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-run-httpd\") pod \"1addae43-922a-415a-baa5-b5e8c07c6bc2\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.912527 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-scripts\") pod \"1addae43-922a-415a-baa5-b5e8c07c6bc2\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.912622 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-config-data\") pod \"1addae43-922a-415a-baa5-b5e8c07c6bc2\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.912701 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pl9hr\" (UniqueName: \"kubernetes.io/projected/1addae43-922a-415a-baa5-b5e8c07c6bc2-kube-api-access-pl9hr\") pod \"1addae43-922a-415a-baa5-b5e8c07c6bc2\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.912865 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-sg-core-conf-yaml\") pod \"1addae43-922a-415a-baa5-b5e8c07c6bc2\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.912942 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-combined-ca-bundle\") pod \"1addae43-922a-415a-baa5-b5e8c07c6bc2\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.913064 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-log-httpd\") pod \"1addae43-922a-415a-baa5-b5e8c07c6bc2\" (UID: \"1addae43-922a-415a-baa5-b5e8c07c6bc2\") " Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.914621 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1addae43-922a-415a-baa5-b5e8c07c6bc2" (UID: "1addae43-922a-415a-baa5-b5e8c07c6bc2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.914851 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1addae43-922a-415a-baa5-b5e8c07c6bc2" (UID: "1addae43-922a-415a-baa5-b5e8c07c6bc2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.920183 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1addae43-922a-415a-baa5-b5e8c07c6bc2-kube-api-access-pl9hr" (OuterVolumeSpecName: "kube-api-access-pl9hr") pod "1addae43-922a-415a-baa5-b5e8c07c6bc2" (UID: "1addae43-922a-415a-baa5-b5e8c07c6bc2"). InnerVolumeSpecName "kube-api-access-pl9hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.921460 4867 scope.go:117] "RemoveContainer" containerID="68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd" Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.922058 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-scripts" (OuterVolumeSpecName: "scripts") pod "1addae43-922a-415a-baa5-b5e8c07c6bc2" (UID: "1addae43-922a-415a-baa5-b5e8c07c6bc2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:08 crc kubenswrapper[4867]: I1201 09:31:08.967948 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1addae43-922a-415a-baa5-b5e8c07c6bc2" (UID: "1addae43-922a-415a-baa5-b5e8c07c6bc2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.015912 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl9hr\" (UniqueName: \"kubernetes.io/projected/1addae43-922a-415a-baa5-b5e8c07c6bc2-kube-api-access-pl9hr\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.015937 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.015947 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.015955 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1addae43-922a-415a-baa5-b5e8c07c6bc2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.015962 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.017399 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-config-data" (OuterVolumeSpecName: "config-data") pod "1addae43-922a-415a-baa5-b5e8c07c6bc2" (UID: "1addae43-922a-415a-baa5-b5e8c07c6bc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.029742 4867 scope.go:117] "RemoveContainer" containerID="86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.071804 4867 scope.go:117] "RemoveContainer" containerID="9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.074033 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1addae43-922a-415a-baa5-b5e8c07c6bc2" (UID: "1addae43-922a-415a-baa5-b5e8c07c6bc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.106397 4867 scope.go:117] "RemoveContainer" containerID="7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4" Dec 01 09:31:09 crc kubenswrapper[4867]: E1201 09:31:09.109178 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4\": container with ID starting with 7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4 not found: ID does not exist" containerID="7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.109228 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4"} err="failed to get container status \"7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4\": rpc error: code = NotFound desc = could not find container \"7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4\": container with ID starting 
with 7e11d93ac3fd451734de9e63c564d2536e3b091c970fefaeff241f6dc34c2fa4 not found: ID does not exist" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.109255 4867 scope.go:117] "RemoveContainer" containerID="68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd" Dec 01 09:31:09 crc kubenswrapper[4867]: E1201 09:31:09.109637 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd\": container with ID starting with 68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd not found: ID does not exist" containerID="68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.109670 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd"} err="failed to get container status \"68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd\": rpc error: code = NotFound desc = could not find container \"68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd\": container with ID starting with 68aabc6863280923b76d1dfb8d752801744f34a57b519568a28323dcdf563bfd not found: ID does not exist" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.109695 4867 scope.go:117] "RemoveContainer" containerID="86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8" Dec 01 09:31:09 crc kubenswrapper[4867]: E1201 09:31:09.112994 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8\": container with ID starting with 86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8 not found: ID does not exist" containerID="86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8" Dec 01 09:31:09 
crc kubenswrapper[4867]: I1201 09:31:09.113051 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8"} err="failed to get container status \"86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8\": rpc error: code = NotFound desc = could not find container \"86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8\": container with ID starting with 86ba1866de3444f624f6232d5f299097ddb3a5efe86840e9de2536a1404b33a8 not found: ID does not exist" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.113079 4867 scope.go:117] "RemoveContainer" containerID="9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539" Dec 01 09:31:09 crc kubenswrapper[4867]: E1201 09:31:09.113418 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539\": container with ID starting with 9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539 not found: ID does not exist" containerID="9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.113477 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539"} err="failed to get container status \"9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539\": rpc error: code = NotFound desc = could not find container \"9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539\": container with ID starting with 9609f4834621a10ec54480d74f7ccf54ac71aabd1f54306fcbe2fb6a118f9539 not found: ID does not exist" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.117704 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.117739 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1addae43-922a-415a-baa5-b5e8c07c6bc2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.157728 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.203185 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.218643 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fml89\" (UniqueName: \"kubernetes.io/projected/e83471d7-4d9d-427c-b769-bd072acbaae0-kube-api-access-fml89\") pod \"e83471d7-4d9d-427c-b769-bd072acbaae0\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.218698 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-combined-ca-bundle\") pod \"e83471d7-4d9d-427c-b769-bd072acbaae0\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.218774 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-config-data\") pod \"e83471d7-4d9d-427c-b769-bd072acbaae0\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.218853 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-scripts\") pod \"e83471d7-4d9d-427c-b769-bd072acbaae0\" (UID: \"e83471d7-4d9d-427c-b769-bd072acbaae0\") " Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.226568 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.233111 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83471d7-4d9d-427c-b769-bd072acbaae0-kube-api-access-fml89" (OuterVolumeSpecName: "kube-api-access-fml89") pod "e83471d7-4d9d-427c-b769-bd072acbaae0" (UID: "e83471d7-4d9d-427c-b769-bd072acbaae0"). InnerVolumeSpecName "kube-api-access-fml89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.240120 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-scripts" (OuterVolumeSpecName: "scripts") pod "e83471d7-4d9d-427c-b769-bd072acbaae0" (UID: "e83471d7-4d9d-427c-b769-bd072acbaae0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.248204 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:09 crc kubenswrapper[4867]: E1201 09:31:09.248586 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83471d7-4d9d-427c-b769-bd072acbaae0" containerName="nova-cell0-conductor-db-sync" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.248607 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83471d7-4d9d-427c-b769-bd072acbaae0" containerName="nova-cell0-conductor-db-sync" Dec 01 09:31:09 crc kubenswrapper[4867]: E1201 09:31:09.248640 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="ceilometer-central-agent" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.248648 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="ceilometer-central-agent" Dec 01 09:31:09 crc kubenswrapper[4867]: E1201 09:31:09.248667 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="sg-core" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.248676 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="sg-core" Dec 01 09:31:09 crc kubenswrapper[4867]: E1201 09:31:09.248697 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="proxy-httpd" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.248706 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="proxy-httpd" Dec 01 09:31:09 crc kubenswrapper[4867]: E1201 09:31:09.248724 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" 
containerName="ceilometer-notification-agent" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.248732 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="ceilometer-notification-agent" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.249106 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="proxy-httpd" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.249136 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="ceilometer-notification-agent" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.249156 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="ceilometer-central-agent" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.249165 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83471d7-4d9d-427c-b769-bd072acbaae0" containerName="nova-cell0-conductor-db-sync" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.249178 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" containerName="sg-core" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.250801 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.254147 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.254233 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.260851 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.276459 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-config-data" (OuterVolumeSpecName: "config-data") pod "e83471d7-4d9d-427c-b769-bd072acbaae0" (UID: "e83471d7-4d9d-427c-b769-bd072acbaae0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.288431 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e83471d7-4d9d-427c-b769-bd072acbaae0" (UID: "e83471d7-4d9d-427c-b769-bd072acbaae0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.320893 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4m7w\" (UniqueName: \"kubernetes.io/projected/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-kube-api-access-p4m7w\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321127 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321175 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-log-httpd\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321199 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321218 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-run-httpd\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321235 
4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-config-data\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321259 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-scripts\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321365 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fml89\" (UniqueName: \"kubernetes.io/projected/e83471d7-4d9d-427c-b769-bd072acbaae0-kube-api-access-fml89\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321375 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321385 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.321395 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83471d7-4d9d-427c-b769-bd072acbaae0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.423271 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4m7w\" (UniqueName: \"kubernetes.io/projected/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-kube-api-access-p4m7w\") 
pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.423335 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.423371 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-log-httpd\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.423397 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.423422 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-run-httpd\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.423455 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-config-data\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.423485 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-scripts\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.423891 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-log-httpd\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.424211 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-run-httpd\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.429011 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.429573 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-scripts\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.430015 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-config-data\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.433325 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.440844 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4m7w\" (UniqueName: \"kubernetes.io/projected/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-kube-api-access-p4m7w\") pod \"ceilometer-0\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.572958 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.874521 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" event={"ID":"e83471d7-4d9d-427c-b769-bd072acbaae0","Type":"ContainerDied","Data":"f84aeb93a79a6a9067c2ec22c426a95f0a177b3b16005ffb912726368c59e765"} Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.874564 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f84aeb93a79a6a9067c2ec22c426a95f0a177b3b16005ffb912726368c59e765" Dec 01 09:31:09 crc kubenswrapper[4867]: I1201 09:31:09.874626 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cdt5g" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.013240 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.021628 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.027195 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cbblx" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.027415 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.033513 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3030542c-dee9-40e5-af75-53a0bbc22301-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3030542c-dee9-40e5-af75-53a0bbc22301\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.033603 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76rjr\" (UniqueName: \"kubernetes.io/projected/3030542c-dee9-40e5-af75-53a0bbc22301-kube-api-access-76rjr\") pod \"nova-cell0-conductor-0\" (UID: \"3030542c-dee9-40e5-af75-53a0bbc22301\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.033624 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3030542c-dee9-40e5-af75-53a0bbc22301-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3030542c-dee9-40e5-af75-53a0bbc22301\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.036189 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.053250 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.134734 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3030542c-dee9-40e5-af75-53a0bbc22301-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3030542c-dee9-40e5-af75-53a0bbc22301\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.135119 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76rjr\" (UniqueName: \"kubernetes.io/projected/3030542c-dee9-40e5-af75-53a0bbc22301-kube-api-access-76rjr\") pod \"nova-cell0-conductor-0\" (UID: \"3030542c-dee9-40e5-af75-53a0bbc22301\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.135141 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3030542c-dee9-40e5-af75-53a0bbc22301-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3030542c-dee9-40e5-af75-53a0bbc22301\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.141292 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3030542c-dee9-40e5-af75-53a0bbc22301-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3030542c-dee9-40e5-af75-53a0bbc22301\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.141477 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3030542c-dee9-40e5-af75-53a0bbc22301-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3030542c-dee9-40e5-af75-53a0bbc22301\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.156074 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76rjr\" (UniqueName: 
\"kubernetes.io/projected/3030542c-dee9-40e5-af75-53a0bbc22301-kube-api-access-76rjr\") pod \"nova-cell0-conductor-0\" (UID: \"3030542c-dee9-40e5-af75-53a0bbc22301\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.342772 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.855502 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1addae43-922a-415a-baa5-b5e8c07c6bc2" path="/var/lib/kubelet/pods/1addae43-922a-415a-baa5-b5e8c07c6bc2/volumes" Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.887917 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerStarted","Data":"e133893098e3230a737ddfab4733a36f6c1046070857ac121591b26b7c993bcb"} Dec 01 09:31:10 crc kubenswrapper[4867]: I1201 09:31:10.892453 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 09:31:11 crc kubenswrapper[4867]: I1201 09:31:11.897630 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3030542c-dee9-40e5-af75-53a0bbc22301","Type":"ContainerStarted","Data":"9c9881738ff798b314ceaa969ed43cc13d5b1268b26e52970c169d90b5974f29"} Dec 01 09:31:11 crc kubenswrapper[4867]: I1201 09:31:11.897994 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:11 crc kubenswrapper[4867]: I1201 09:31:11.898008 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3030542c-dee9-40e5-af75-53a0bbc22301","Type":"ContainerStarted","Data":"fed57a258ce429bfb513f176419a04acd53a7e9ebe9c4a4554b43a142a6a04c4"} Dec 01 09:31:11 crc kubenswrapper[4867]: I1201 09:31:11.899399 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerStarted","Data":"cf77186aa773bcab3bf2c79ec6ea40ad2f19edf5f3aaef1706e4acedad6b86ed"} Dec 01 09:31:11 crc kubenswrapper[4867]: I1201 09:31:11.926396 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.926337936 podStartE2EDuration="2.926337936s" podCreationTimestamp="2025-12-01 09:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:31:11.922995824 +0000 UTC m=+1393.382382598" watchObservedRunningTime="2025-12-01 09:31:11.926337936 +0000 UTC m=+1393.385724690" Dec 01 09:31:12 crc kubenswrapper[4867]: I1201 09:31:12.910032 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:31:12 crc kubenswrapper[4867]: I1201 09:31:12.910657 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:31:12 crc kubenswrapper[4867]: I1201 09:31:12.931229 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerStarted","Data":"64f000c48da56ecc2fc797eaa871bda89e39c52aed740d50d8b8f28fd88931d7"} Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.293092 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.293152 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.621887 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h9tkh"] Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.623636 4867 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.647993 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h9tkh"] Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.707237 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-catalog-content\") pod \"redhat-operators-h9tkh\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.707341 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q768s\" (UniqueName: \"kubernetes.io/projected/b0cf63e8-e586-4cac-990d-43c976b30366-kube-api-access-q768s\") pod \"redhat-operators-h9tkh\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.707372 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-utilities\") pod \"redhat-operators-h9tkh\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.809162 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-catalog-content\") pod \"redhat-operators-h9tkh\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.809290 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q768s\" (UniqueName: \"kubernetes.io/projected/b0cf63e8-e586-4cac-990d-43c976b30366-kube-api-access-q768s\") pod \"redhat-operators-h9tkh\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.809323 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-utilities\") pod \"redhat-operators-h9tkh\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.809752 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-catalog-content\") pod \"redhat-operators-h9tkh\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.809872 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-utilities\") pod \"redhat-operators-h9tkh\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.841802 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q768s\" (UniqueName: \"kubernetes.io/projected/b0cf63e8-e586-4cac-990d-43c976b30366-kube-api-access-q768s\") pod \"redhat-operators-h9tkh\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:13 crc kubenswrapper[4867]: I1201 09:31:13.947260 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:14 crc kubenswrapper[4867]: I1201 09:31:14.455849 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h9tkh"] Dec 01 09:31:14 crc kubenswrapper[4867]: W1201 09:31:14.458210 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0cf63e8_e586_4cac_990d_43c976b30366.slice/crio-9b4126d358320e01c3ca3ce3e5e60bde9ca5e7d0d333ade7e2d31e69977c1757 WatchSource:0}: Error finding container 9b4126d358320e01c3ca3ce3e5e60bde9ca5e7d0d333ade7e2d31e69977c1757: Status 404 returned error can't find the container with id 9b4126d358320e01c3ca3ce3e5e60bde9ca5e7d0d333ade7e2d31e69977c1757 Dec 01 09:31:14 crc kubenswrapper[4867]: I1201 09:31:14.950164 4867 generic.go:334] "Generic (PLEG): container finished" podID="b0cf63e8-e586-4cac-990d-43c976b30366" containerID="50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab" exitCode=0 Dec 01 09:31:14 crc kubenswrapper[4867]: I1201 09:31:14.950357 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h9tkh" event={"ID":"b0cf63e8-e586-4cac-990d-43c976b30366","Type":"ContainerDied","Data":"50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab"} Dec 01 09:31:14 crc kubenswrapper[4867]: I1201 09:31:14.950437 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h9tkh" event={"ID":"b0cf63e8-e586-4cac-990d-43c976b30366","Type":"ContainerStarted","Data":"9b4126d358320e01c3ca3ce3e5e60bde9ca5e7d0d333ade7e2d31e69977c1757"} Dec 01 09:31:14 crc kubenswrapper[4867]: I1201 09:31:14.956967 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerStarted","Data":"ad06385dd1bb98a0d533ddc771da744c4662aaa6f19f1c612fa6ca360bb51354"} Dec 01 09:31:15 crc 
kubenswrapper[4867]: I1201 09:31:15.971194 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h9tkh" event={"ID":"b0cf63e8-e586-4cac-990d-43c976b30366","Type":"ContainerStarted","Data":"751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959"} Dec 01 09:31:16 crc kubenswrapper[4867]: I1201 09:31:16.997296 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerStarted","Data":"da9bc2922244a660217e6317aaa2a120ab01c97bca5a859e5bde4da85234edc0"} Dec 01 09:31:16 crc kubenswrapper[4867]: I1201 09:31:16.997731 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:31:17 crc kubenswrapper[4867]: I1201 09:31:17.024311 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.340525745 podStartE2EDuration="8.024292423s" podCreationTimestamp="2025-12-01 09:31:09 +0000 UTC" firstStartedPulling="2025-12-01 09:31:10.075524443 +0000 UTC m=+1391.534911197" lastFinishedPulling="2025-12-01 09:31:15.759291121 +0000 UTC m=+1397.218677875" observedRunningTime="2025-12-01 09:31:17.023776469 +0000 UTC m=+1398.483163233" watchObservedRunningTime="2025-12-01 09:31:17.024292423 +0000 UTC m=+1398.483679177" Dec 01 09:31:20 crc kubenswrapper[4867]: I1201 09:31:20.372955 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 09:31:20 crc kubenswrapper[4867]: I1201 09:31:20.943408 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-zvrlk"] Dec 01 09:31:20 crc kubenswrapper[4867]: I1201 09:31:20.944939 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:20 crc kubenswrapper[4867]: I1201 09:31:20.947732 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 09:31:20 crc kubenswrapper[4867]: I1201 09:31:20.947946 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 09:31:20 crc kubenswrapper[4867]: I1201 09:31:20.967023 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-zvrlk"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.033189 4867 generic.go:334] "Generic (PLEG): container finished" podID="b0cf63e8-e586-4cac-990d-43c976b30366" containerID="751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959" exitCode=0 Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.033416 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h9tkh" event={"ID":"b0cf63e8-e586-4cac-990d-43c976b30366","Type":"ContainerDied","Data":"751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959"} Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.153788 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.153968 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzmsw\" (UniqueName: \"kubernetes.io/projected/d4cb4881-0b4e-4085-9627-1efc85a5efaa-kube-api-access-jzmsw\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 
09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.154080 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-scripts\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.154166 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-config-data\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.171120 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.172655 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.176771 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.190919 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.222023 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.223292 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.225793 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.245932 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.256669 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-scripts\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.256753 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drttj\" (UniqueName: \"kubernetes.io/projected/5595c9f3-b2c1-4104-8752-eb39efd14d1c-kube-api-access-drttj\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.256832 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-config-data\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.256864 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5595c9f3-b2c1-4104-8752-eb39efd14d1c-logs\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.256895 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.256922 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.257029 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-config-data\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.257060 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzmsw\" (UniqueName: \"kubernetes.io/projected/d4cb4881-0b4e-4085-9627-1efc85a5efaa-kube-api-access-jzmsw\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.275967 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-config-data\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.281856 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.282846 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-scripts\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.310687 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzmsw\" (UniqueName: \"kubernetes.io/projected/d4cb4881-0b4e-4085-9627-1efc85a5efaa-kube-api-access-jzmsw\") pod \"nova-cell0-cell-mapping-zvrlk\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.358358 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jw7z\" (UniqueName: \"kubernetes.io/projected/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-kube-api-access-9jw7z\") pod \"nova-cell1-novncproxy-0\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.358427 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5595c9f3-b2c1-4104-8752-eb39efd14d1c-logs\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.358458 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.358497 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-config-data\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.358516 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.358556 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.358624 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drttj\" (UniqueName: \"kubernetes.io/projected/5595c9f3-b2c1-4104-8752-eb39efd14d1c-kube-api-access-drttj\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.359001 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5595c9f3-b2c1-4104-8752-eb39efd14d1c-logs\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.365361 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.367040 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-config-data\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.370487 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.372191 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.379850 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.392466 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drttj\" (UniqueName: \"kubernetes.io/projected/5595c9f3-b2c1-4104-8752-eb39efd14d1c-kube-api-access-drttj\") pod \"nova-api-0\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.431495 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.460153 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.460244 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7b8\" (UniqueName: \"kubernetes.io/projected/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-kube-api-access-cr7b8\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.460339 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.460395 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jw7z\" (UniqueName: \"kubernetes.io/projected/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-kube-api-access-9jw7z\") pod \"nova-cell1-novncproxy-0\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.460463 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-config-data\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.460518 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-logs\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 
09:31:21.460543 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.468337 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.484746 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.506416 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jw7z\" (UniqueName: \"kubernetes.io/projected/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-kube-api-access-9jw7z\") pod \"nova-cell1-novncproxy-0\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.518874 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.557296 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.562584 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.563400 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr7b8\" (UniqueName: \"kubernetes.io/projected/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-kube-api-access-cr7b8\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.563535 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.563639 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-config-data\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.563686 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-logs\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.564662 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-logs\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.566781 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.569639 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-config-data\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.590066 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-slzgk"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.591595 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.607237 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7b8\" (UniqueName: \"kubernetes.io/projected/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-kube-api-access-cr7b8\") pod \"nova-metadata-0\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.622655 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-slzgk"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.738555 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.749609 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.759027 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.771319 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.771338 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jc97\" (UniqueName: \"kubernetes.io/projected/ce614908-558e-45de-ac61-f095149fee19-kube-api-access-8jc97\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.771523 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-svc\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.771600 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.771625 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " 
pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.771649 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-config\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.771678 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.831767 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.875003 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-config-data\") pod \"nova-scheduler-0\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.875113 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-svc\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.875170 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.875195 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.875218 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-config\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.875251 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.875276 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.875331 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2k8m\" (UniqueName: 
\"kubernetes.io/projected/a6fb7b9a-85bc-4023-b2f6-68f713287770-kube-api-access-x2k8m\") pod \"nova-scheduler-0\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.875368 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jc97\" (UniqueName: \"kubernetes.io/projected/ce614908-558e-45de-ac61-f095149fee19-kube-api-access-8jc97\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.877034 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.877051 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-config\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.877448 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-svc\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.878125 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-nb\") pod 
\"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.878908 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.908677 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jc97\" (UniqueName: \"kubernetes.io/projected/ce614908-558e-45de-ac61-f095149fee19-kube-api-access-8jc97\") pod \"dnsmasq-dns-757b4f8459-slzgk\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") " pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.949245 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.980656 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-config-data\") pod \"nova-scheduler-0\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.980801 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.980868 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2k8m\" (UniqueName: \"kubernetes.io/projected/a6fb7b9a-85bc-4023-b2f6-68f713287770-kube-api-access-x2k8m\") pod \"nova-scheduler-0\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.987549 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-config-data\") pod \"nova-scheduler-0\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:21 crc kubenswrapper[4867]: I1201 09:31:21.987556 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:22 crc kubenswrapper[4867]: I1201 09:31:22.032415 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x2k8m\" (UniqueName: \"kubernetes.io/projected/a6fb7b9a-85bc-4023-b2f6-68f713287770-kube-api-access-x2k8m\") pod \"nova-scheduler-0\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:22 crc kubenswrapper[4867]: I1201 09:31:22.075365 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:31:22 crc kubenswrapper[4867]: I1201 09:31:22.358031 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:31:22 crc kubenswrapper[4867]: I1201 09:31:22.394640 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:31:22 crc kubenswrapper[4867]: I1201 09:31:22.581557 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-zvrlk"] Dec 01 09:31:22 crc kubenswrapper[4867]: I1201 09:31:22.731040 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:22 crc kubenswrapper[4867]: I1201 09:31:22.928050 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.038871 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-slzgk"] Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.086241 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.119279 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" 
event={"ID":"ce614908-558e-45de-ac61-f095149fee19","Type":"ContainerStarted","Data":"aa79e1566426062e2a1ff93e1297e6aba8bb4a72927c298aab4a570d6796b018"} Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.121640 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5595c9f3-b2c1-4104-8752-eb39efd14d1c","Type":"ContainerStarted","Data":"1c5a720f3aed51b5334847810837fec9878aeb813f0c521c0828d8bde871186c"} Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.124153 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1e6736b-61bb-4787-8f0c-81bfb3934c0b","Type":"ContainerStarted","Data":"bcad559af516f5ef02124a730daec31105679dca9c9c597b6bed72b49f1df503"} Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.130166 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6fb7b9a-85bc-4023-b2f6-68f713287770","Type":"ContainerStarted","Data":"5ea5ec93a4277105498b9380bdc8727bb480e6f1d27cdb1523210c33b2801b4d"} Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.134453 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h9tkh" event={"ID":"b0cf63e8-e586-4cac-990d-43c976b30366","Type":"ContainerStarted","Data":"3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73"} Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.137064 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e","Type":"ContainerStarted","Data":"9cfbacc64023085117a2c6121b215ff83baadc38e3d7bf45959a3aeceeb20074"} Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.139751 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zvrlk" 
event={"ID":"d4cb4881-0b4e-4085-9627-1efc85a5efaa","Type":"ContainerStarted","Data":"7bd7bee536bed5d8ab073736cae20620eed752dc74f597458092ed36afd1ec9e"} Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.172945 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h9tkh" podStartSLOduration=2.628536898 podStartE2EDuration="10.172926519s" podCreationTimestamp="2025-12-01 09:31:13 +0000 UTC" firstStartedPulling="2025-12-01 09:31:14.954082649 +0000 UTC m=+1396.413469403" lastFinishedPulling="2025-12-01 09:31:22.49847227 +0000 UTC m=+1403.957859024" observedRunningTime="2025-12-01 09:31:23.15660786 +0000 UTC m=+1404.615994614" watchObservedRunningTime="2025-12-01 09:31:23.172926519 +0000 UTC m=+1404.632313273" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.181996 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-zvrlk" podStartSLOduration=3.181979267 podStartE2EDuration="3.181979267s" podCreationTimestamp="2025-12-01 09:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:31:23.181366791 +0000 UTC m=+1404.640753555" watchObservedRunningTime="2025-12-01 09:31:23.181979267 +0000 UTC m=+1404.641366021" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.287179 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-md2jq"] Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.292055 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.297243 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.305491 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.305551 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.347902 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbt2h\" (UniqueName: \"kubernetes.io/projected/aba41ac2-513c-437d-a26c-7b341306bddc-kube-api-access-rbt2h\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.347965 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-config-data\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.347989 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-scripts\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: 
\"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.348033 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.364207 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-md2jq"] Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.449767 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbt2h\" (UniqueName: \"kubernetes.io/projected/aba41ac2-513c-437d-a26c-7b341306bddc-kube-api-access-rbt2h\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.449909 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-config-data\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.449945 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-scripts\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.450000 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.478253 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.478939 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-scripts\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.491835 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-config-data\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.501825 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbt2h\" (UniqueName: \"kubernetes.io/projected/aba41ac2-513c-437d-a26c-7b341306bddc-kube-api-access-rbt2h\") pod \"nova-cell1-conductor-db-sync-md2jq\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.739559 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.948053 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:23 crc kubenswrapper[4867]: I1201 09:31:23.948361 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:24 crc kubenswrapper[4867]: I1201 09:31:24.172873 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zvrlk" event={"ID":"d4cb4881-0b4e-4085-9627-1efc85a5efaa","Type":"ContainerStarted","Data":"caae03d03e7fecae1a95ba8eb1c57e96e353740e93fb825166e570ab1a4d60b8"} Dec 01 09:31:24 crc kubenswrapper[4867]: I1201 09:31:24.205537 4867 generic.go:334] "Generic (PLEG): container finished" podID="ce614908-558e-45de-ac61-f095149fee19" containerID="6f450b6a19142aa05ee93a7e928d8d391e4ce7caa724c0bc3b7f09a7e6341c91" exitCode=0 Dec 01 09:31:24 crc kubenswrapper[4867]: I1201 09:31:24.205741 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" event={"ID":"ce614908-558e-45de-ac61-f095149fee19","Type":"ContainerDied","Data":"6f450b6a19142aa05ee93a7e928d8d391e4ce7caa724c0bc3b7f09a7e6341c91"} Dec 01 09:31:24 crc kubenswrapper[4867]: I1201 09:31:24.306014 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-md2jq"] Dec 01 09:31:25 crc kubenswrapper[4867]: I1201 09:31:25.021576 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h9tkh" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="registry-server" probeResult="failure" output=< Dec 01 09:31:25 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 09:31:25 crc kubenswrapper[4867]: > Dec 01 09:31:25 crc kubenswrapper[4867]: I1201 09:31:25.229493 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" event={"ID":"ce614908-558e-45de-ac61-f095149fee19","Type":"ContainerStarted","Data":"a0941972b92dae92cff4525420fc6089a42f6b0cceffb85f50f3a0f83eee7dab"} Dec 01 09:31:25 crc kubenswrapper[4867]: I1201 09:31:25.230201 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:25 crc kubenswrapper[4867]: I1201 09:31:25.244774 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-md2jq" event={"ID":"aba41ac2-513c-437d-a26c-7b341306bddc","Type":"ContainerStarted","Data":"41246256b811696ce11c99c101cb562ee0d889974e71ac8343234377721d4eb6"} Dec 01 09:31:25 crc kubenswrapper[4867]: I1201 09:31:25.244818 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-md2jq" event={"ID":"aba41ac2-513c-437d-a26c-7b341306bddc","Type":"ContainerStarted","Data":"9322595dcab95928114d1229b303aecadda923eaee05d67e89cc2afd036cbdc4"} Dec 01 09:31:25 crc kubenswrapper[4867]: I1201 09:31:25.265052 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" podStartSLOduration=4.265034094 podStartE2EDuration="4.265034094s" podCreationTimestamp="2025-12-01 09:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:31:25.259240015 +0000 UTC m=+1406.718626769" watchObservedRunningTime="2025-12-01 09:31:25.265034094 +0000 UTC m=+1406.724420848" Dec 01 09:31:26 crc kubenswrapper[4867]: I1201 09:31:26.778037 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-md2jq" podStartSLOduration=3.778013571 podStartE2EDuration="3.778013571s" podCreationTimestamp="2025-12-01 09:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:31:25.287633216 +0000 UTC m=+1406.747019970" watchObservedRunningTime="2025-12-01 09:31:26.778013571 +0000 UTC m=+1408.237400325" Dec 01 09:31:26 crc kubenswrapper[4867]: I1201 09:31:26.786006 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:26 crc kubenswrapper[4867]: I1201 09:31:26.800628 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.308360 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6fb7b9a-85bc-4023-b2f6-68f713287770","Type":"ContainerStarted","Data":"bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652"} Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.311775 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e","Type":"ContainerStarted","Data":"406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec"} Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.311879 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec" gracePeriod=30 Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.313683 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5595c9f3-b2c1-4104-8752-eb39efd14d1c","Type":"ContainerStarted","Data":"e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5"} Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.313728 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5595c9f3-b2c1-4104-8752-eb39efd14d1c","Type":"ContainerStarted","Data":"6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2"} Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.318190 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1e6736b-61bb-4787-8f0c-81bfb3934c0b","Type":"ContainerStarted","Data":"f3ef8a0c20261aa0b9cd0cfc5dd04d9f041ba12261facb14043e30a41c0a4788"} Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.318222 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1e6736b-61bb-4787-8f0c-81bfb3934c0b","Type":"ContainerStarted","Data":"9ad51f9fd9a76b6958eab2b67b127be3f2bf3548e3ddf9051911fa9aafdcb204"} Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.318315 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerName="nova-metadata-log" containerID="cri-o://9ad51f9fd9a76b6958eab2b67b127be3f2bf3548e3ddf9051911fa9aafdcb204" gracePeriod=30 Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.318511 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerName="nova-metadata-metadata" containerID="cri-o://f3ef8a0c20261aa0b9cd0cfc5dd04d9f041ba12261facb14043e30a41c0a4788" gracePeriod=30 Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.335692 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.508587962 podStartE2EDuration="10.335670037s" podCreationTimestamp="2025-12-01 09:31:21 +0000 UTC" firstStartedPulling="2025-12-01 09:31:23.017588348 +0000 UTC m=+1404.476975102" lastFinishedPulling="2025-12-01 09:31:29.844670423 +0000 UTC m=+1411.304057177" observedRunningTime="2025-12-01 09:31:31.328669484 +0000 UTC m=+1412.788056238" 
watchObservedRunningTime="2025-12-01 09:31:31.335670037 +0000 UTC m=+1412.795056791" Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.382631 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.828667672 podStartE2EDuration="10.382614416s" podCreationTimestamp="2025-12-01 09:31:21 +0000 UTC" firstStartedPulling="2025-12-01 09:31:22.387546131 +0000 UTC m=+1403.846932885" lastFinishedPulling="2025-12-01 09:31:29.941492875 +0000 UTC m=+1411.400879629" observedRunningTime="2025-12-01 09:31:31.360733006 +0000 UTC m=+1412.820119760" watchObservedRunningTime="2025-12-01 09:31:31.382614416 +0000 UTC m=+1412.842001170" Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.384385 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.898211024 podStartE2EDuration="10.384379385s" podCreationTimestamp="2025-12-01 09:31:21 +0000 UTC" firstStartedPulling="2025-12-01 09:31:22.453288888 +0000 UTC m=+1403.912675642" lastFinishedPulling="2025-12-01 09:31:29.939457249 +0000 UTC m=+1411.398844003" observedRunningTime="2025-12-01 09:31:31.377907638 +0000 UTC m=+1412.837294392" watchObservedRunningTime="2025-12-01 09:31:31.384379385 +0000 UTC m=+1412.843766139" Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.404239 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.222171699 podStartE2EDuration="10.40421697s" podCreationTimestamp="2025-12-01 09:31:21 +0000 UTC" firstStartedPulling="2025-12-01 09:31:22.776013009 +0000 UTC m=+1404.235399753" lastFinishedPulling="2025-12-01 09:31:29.95805826 +0000 UTC m=+1411.417445024" observedRunningTime="2025-12-01 09:31:31.394544625 +0000 UTC m=+1412.853931379" watchObservedRunningTime="2025-12-01 09:31:31.40421697 +0000 UTC m=+1412.863603724" Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.519798 4867 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.519869 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.558557 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.832774 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.832838 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:31:31 crc kubenswrapper[4867]: I1201 09:31:31.952017 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.059904 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cvdlf"] Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.060479 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" podUID="795e69a3-9500-444d-8d6e-af50ede7c060" containerName="dnsmasq-dns" containerID="cri-o://6a8e1b17c2b8ffea65982c5b03bd3eb663e08f3f4a7c08352d8ed2645d0382b0" gracePeriod=10 Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.077233 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.077273 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.231435 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 09:31:32 crc 
kubenswrapper[4867]: I1201 09:31:32.395604 4867 generic.go:334] "Generic (PLEG): container finished" podID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerID="f3ef8a0c20261aa0b9cd0cfc5dd04d9f041ba12261facb14043e30a41c0a4788" exitCode=0 Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.395645 4867 generic.go:334] "Generic (PLEG): container finished" podID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerID="9ad51f9fd9a76b6958eab2b67b127be3f2bf3548e3ddf9051911fa9aafdcb204" exitCode=143 Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.395702 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1e6736b-61bb-4787-8f0c-81bfb3934c0b","Type":"ContainerDied","Data":"f3ef8a0c20261aa0b9cd0cfc5dd04d9f041ba12261facb14043e30a41c0a4788"} Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.395772 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1e6736b-61bb-4787-8f0c-81bfb3934c0b","Type":"ContainerDied","Data":"9ad51f9fd9a76b6958eab2b67b127be3f2bf3548e3ddf9051911fa9aafdcb204"} Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.395787 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1e6736b-61bb-4787-8f0c-81bfb3934c0b","Type":"ContainerDied","Data":"bcad559af516f5ef02124a730daec31105679dca9c9c597b6bed72b49f1df503"} Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.395798 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcad559af516f5ef02124a730daec31105679dca9c9c597b6bed72b49f1df503" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.403288 4867 generic.go:334] "Generic (PLEG): container finished" podID="795e69a3-9500-444d-8d6e-af50ede7c060" containerID="6a8e1b17c2b8ffea65982c5b03bd3eb663e08f3f4a7c08352d8ed2645d0382b0" exitCode=0 Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.403584 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" event={"ID":"795e69a3-9500-444d-8d6e-af50ede7c060","Type":"ContainerDied","Data":"6a8e1b17c2b8ffea65982c5b03bd3eb663e08f3f4a7c08352d8ed2645d0382b0"} Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.463526 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.529348 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.564150 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.582367 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-logs\") pod \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.582461 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-combined-ca-bundle\") pod \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.582701 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-config-data\") pod \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 
09:31:32.582755 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr7b8\" (UniqueName: \"kubernetes.io/projected/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-kube-api-access-cr7b8\") pod \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\" (UID: \"b1e6736b-61bb-4787-8f0c-81bfb3934c0b\") " Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.584621 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-logs" (OuterVolumeSpecName: "logs") pod "b1e6736b-61bb-4787-8f0c-81bfb3934c0b" (UID: "b1e6736b-61bb-4787-8f0c-81bfb3934c0b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.607259 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.627034 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-kube-api-access-cr7b8" (OuterVolumeSpecName: "kube-api-access-cr7b8") pod "b1e6736b-61bb-4787-8f0c-81bfb3934c0b" (UID: "b1e6736b-61bb-4787-8f0c-81bfb3934c0b"). InnerVolumeSpecName "kube-api-access-cr7b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.636942 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1e6736b-61bb-4787-8f0c-81bfb3934c0b" (UID: "b1e6736b-61bb-4787-8f0c-81bfb3934c0b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.686469 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.717171 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.717198 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr7b8\" (UniqueName: \"kubernetes.io/projected/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-kube-api-access-cr7b8\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.771187 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-config-data" (OuterVolumeSpecName: "config-data") pod "b1e6736b-61bb-4787-8f0c-81bfb3934c0b" (UID: "b1e6736b-61bb-4787-8f0c-81bfb3934c0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.827048 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1e6736b-61bb-4787-8f0c-81bfb3934c0b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.879730 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.906900 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.930477 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-swift-storage-0\") pod \"795e69a3-9500-444d-8d6e-af50ede7c060\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.930615 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-config\") pod \"795e69a3-9500-444d-8d6e-af50ede7c060\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.930647 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-svc\") pod \"795e69a3-9500-444d-8d6e-af50ede7c060\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.930808 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fclg8\" (UniqueName: \"kubernetes.io/projected/795e69a3-9500-444d-8d6e-af50ede7c060-kube-api-access-fclg8\") pod \"795e69a3-9500-444d-8d6e-af50ede7c060\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.935950 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-sb\") pod \"795e69a3-9500-444d-8d6e-af50ede7c060\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " Dec 01 09:31:32 crc kubenswrapper[4867]: I1201 09:31:32.936022 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-nb\") pod \"795e69a3-9500-444d-8d6e-af50ede7c060\" (UID: \"795e69a3-9500-444d-8d6e-af50ede7c060\") " Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.010157 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795e69a3-9500-444d-8d6e-af50ede7c060-kube-api-access-fclg8" (OuterVolumeSpecName: "kube-api-access-fclg8") pod "795e69a3-9500-444d-8d6e-af50ede7c060" (UID: "795e69a3-9500-444d-8d6e-af50ede7c060"). InnerVolumeSpecName "kube-api-access-fclg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.044538 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fclg8\" (UniqueName: \"kubernetes.io/projected/795e69a3-9500-444d-8d6e-af50ede7c060-kube-api-access-fclg8\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.108444 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "795e69a3-9500-444d-8d6e-af50ede7c060" (UID: "795e69a3-9500-444d-8d6e-af50ede7c060"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.123435 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-config" (OuterVolumeSpecName: "config") pod "795e69a3-9500-444d-8d6e-af50ede7c060" (UID: "795e69a3-9500-444d-8d6e-af50ede7c060"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.129664 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "795e69a3-9500-444d-8d6e-af50ede7c060" (UID: "795e69a3-9500-444d-8d6e-af50ede7c060"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.134786 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "795e69a3-9500-444d-8d6e-af50ede7c060" (UID: "795e69a3-9500-444d-8d6e-af50ede7c060"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.140345 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "795e69a3-9500-444d-8d6e-af50ede7c060" (UID: "795e69a3-9500-444d-8d6e-af50ede7c060"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.146681 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.146951 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.147017 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.147082 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.147140 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/795e69a3-9500-444d-8d6e-af50ede7c060-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.294967 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d47c7cb76-srf4p" podUID="8bd4fac2-df2c-4aab-bf00-99b54a83ddca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.442545 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.442962 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.443224 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cvdlf" event={"ID":"795e69a3-9500-444d-8d6e-af50ede7c060","Type":"ContainerDied","Data":"17646e1825c13641d883345ffd53716becf72db2c6103a6e7f4cda506a887cba"} Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.443295 4867 scope.go:117] "RemoveContainer" containerID="6a8e1b17c2b8ffea65982c5b03bd3eb663e08f3f4a7c08352d8ed2645d0382b0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.472994 4867 scope.go:117] "RemoveContainer" containerID="5aa650b1b5b93fe95bb5bd4047a27e48a4c82478dcc0cb4ef20d9a4ee025cfe9" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.489663 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.524379 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.575302 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:33 crc kubenswrapper[4867]: E1201 09:31:33.575727 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795e69a3-9500-444d-8d6e-af50ede7c060" containerName="init" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.575743 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="795e69a3-9500-444d-8d6e-af50ede7c060" containerName="init" Dec 01 09:31:33 crc kubenswrapper[4867]: E1201 09:31:33.575765 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerName="nova-metadata-metadata" Dec 01 09:31:33 crc kubenswrapper[4867]: 
I1201 09:31:33.575772 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerName="nova-metadata-metadata" Dec 01 09:31:33 crc kubenswrapper[4867]: E1201 09:31:33.575790 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerName="nova-metadata-log" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.575797 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerName="nova-metadata-log" Dec 01 09:31:33 crc kubenswrapper[4867]: E1201 09:31:33.575821 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795e69a3-9500-444d-8d6e-af50ede7c060" containerName="dnsmasq-dns" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.575857 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="795e69a3-9500-444d-8d6e-af50ede7c060" containerName="dnsmasq-dns" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.576032 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="795e69a3-9500-444d-8d6e-af50ede7c060" containerName="dnsmasq-dns" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.576051 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerName="nova-metadata-log" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.576066 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" containerName="nova-metadata-metadata" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.577216 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.580245 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.580312 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.599770 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cvdlf"] Dec 01 09:31:33 crc kubenswrapper[4867]: E1201 09:31:33.604671 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod795e69a3_9500_444d_8d6e_af50ede7c060.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.611150 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cvdlf"] Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.621877 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.654812 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtbpb\" (UniqueName: \"kubernetes.io/projected/9071cf8c-0cf8-463d-9857-024b96bc1dd4-kube-api-access-mtbpb\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.654879 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " 
pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.654916 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9071cf8c-0cf8-463d-9857-024b96bc1dd4-logs\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.654995 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-config-data\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.655018 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.756738 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtbpb\" (UniqueName: \"kubernetes.io/projected/9071cf8c-0cf8-463d-9857-024b96bc1dd4-kube-api-access-mtbpb\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.756791 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.756834 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9071cf8c-0cf8-463d-9857-024b96bc1dd4-logs\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.756922 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-config-data\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.756971 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.757565 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9071cf8c-0cf8-463d-9857-024b96bc1dd4-logs\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.761473 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.774452 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.774463 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-config-data\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.776337 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtbpb\" (UniqueName: \"kubernetes.io/projected/9071cf8c-0cf8-463d-9857-024b96bc1dd4-kube-api-access-mtbpb\") pod \"nova-metadata-0\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:33 crc kubenswrapper[4867]: I1201 09:31:33.913340 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:34 crc kubenswrapper[4867]: I1201 09:31:34.564396 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:34 crc kubenswrapper[4867]: I1201 09:31:34.838744 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795e69a3-9500-444d-8d6e-af50ede7c060" path="/var/lib/kubelet/pods/795e69a3-9500-444d-8d6e-af50ede7c060/volumes" Dec 01 09:31:34 crc kubenswrapper[4867]: I1201 09:31:34.839473 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1e6736b-61bb-4787-8f0c-81bfb3934c0b" path="/var/lib/kubelet/pods/b1e6736b-61bb-4787-8f0c-81bfb3934c0b/volumes" Dec 01 09:31:35 crc kubenswrapper[4867]: I1201 09:31:35.068643 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h9tkh" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="registry-server" probeResult="failure" output=< Dec 01 09:31:35 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 
09:31:35 crc kubenswrapper[4867]: > Dec 01 09:31:35 crc kubenswrapper[4867]: I1201 09:31:35.480765 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9071cf8c-0cf8-463d-9857-024b96bc1dd4","Type":"ContainerStarted","Data":"a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79"} Dec 01 09:31:35 crc kubenswrapper[4867]: I1201 09:31:35.480863 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9071cf8c-0cf8-463d-9857-024b96bc1dd4","Type":"ContainerStarted","Data":"deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27"} Dec 01 09:31:35 crc kubenswrapper[4867]: I1201 09:31:35.480878 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9071cf8c-0cf8-463d-9857-024b96bc1dd4","Type":"ContainerStarted","Data":"a0fddd7451b06c84575ce73d1522c7409817d7429352a5a4d57aa8bbde6aca24"} Dec 01 09:31:37 crc kubenswrapper[4867]: I1201 09:31:37.499716 4867 generic.go:334] "Generic (PLEG): container finished" podID="aba41ac2-513c-437d-a26c-7b341306bddc" containerID="41246256b811696ce11c99c101cb562ee0d889974e71ac8343234377721d4eb6" exitCode=0 Dec 01 09:31:37 crc kubenswrapper[4867]: I1201 09:31:37.499771 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-md2jq" event={"ID":"aba41ac2-513c-437d-a26c-7b341306bddc","Type":"ContainerDied","Data":"41246256b811696ce11c99c101cb562ee0d889974e71ac8343234377721d4eb6"} Dec 01 09:31:37 crc kubenswrapper[4867]: I1201 09:31:37.530389 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.530364219 podStartE2EDuration="4.530364219s" podCreationTimestamp="2025-12-01 09:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:31:35.512960127 +0000 UTC m=+1416.972346901" 
watchObservedRunningTime="2025-12-01 09:31:37.530364219 +0000 UTC m=+1418.989750973" Dec 01 09:31:38 crc kubenswrapper[4867]: I1201 09:31:38.511974 4867 generic.go:334] "Generic (PLEG): container finished" podID="d4cb4881-0b4e-4085-9627-1efc85a5efaa" containerID="caae03d03e7fecae1a95ba8eb1c57e96e353740e93fb825166e570ab1a4d60b8" exitCode=0 Dec 01 09:31:38 crc kubenswrapper[4867]: I1201 09:31:38.512035 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zvrlk" event={"ID":"d4cb4881-0b4e-4085-9627-1efc85a5efaa","Type":"ContainerDied","Data":"caae03d03e7fecae1a95ba8eb1c57e96e353740e93fb825166e570ab1a4d60b8"} Dec 01 09:31:38 crc kubenswrapper[4867]: I1201 09:31:38.892720 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:38 crc kubenswrapper[4867]: I1201 09:31:38.913428 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:31:38 crc kubenswrapper[4867]: I1201 09:31:38.913475 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.070797 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbt2h\" (UniqueName: \"kubernetes.io/projected/aba41ac2-513c-437d-a26c-7b341306bddc-kube-api-access-rbt2h\") pod \"aba41ac2-513c-437d-a26c-7b341306bddc\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.070945 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-scripts\") pod \"aba41ac2-513c-437d-a26c-7b341306bddc\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.071088 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-config-data\") pod \"aba41ac2-513c-437d-a26c-7b341306bddc\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.071172 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-combined-ca-bundle\") pod \"aba41ac2-513c-437d-a26c-7b341306bddc\" (UID: \"aba41ac2-513c-437d-a26c-7b341306bddc\") " Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.076168 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aba41ac2-513c-437d-a26c-7b341306bddc-kube-api-access-rbt2h" (OuterVolumeSpecName: "kube-api-access-rbt2h") pod "aba41ac2-513c-437d-a26c-7b341306bddc" (UID: "aba41ac2-513c-437d-a26c-7b341306bddc"). InnerVolumeSpecName "kube-api-access-rbt2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.077788 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-scripts" (OuterVolumeSpecName: "scripts") pod "aba41ac2-513c-437d-a26c-7b341306bddc" (UID: "aba41ac2-513c-437d-a26c-7b341306bddc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.101684 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aba41ac2-513c-437d-a26c-7b341306bddc" (UID: "aba41ac2-513c-437d-a26c-7b341306bddc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.108767 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-config-data" (OuterVolumeSpecName: "config-data") pod "aba41ac2-513c-437d-a26c-7b341306bddc" (UID: "aba41ac2-513c-437d-a26c-7b341306bddc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.173642 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.173674 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.173684 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbt2h\" (UniqueName: \"kubernetes.io/projected/aba41ac2-513c-437d-a26c-7b341306bddc-kube-api-access-rbt2h\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.173694 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aba41ac2-513c-437d-a26c-7b341306bddc-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.529870 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-md2jq" event={"ID":"aba41ac2-513c-437d-a26c-7b341306bddc","Type":"ContainerDied","Data":"9322595dcab95928114d1229b303aecadda923eaee05d67e89cc2afd036cbdc4"} Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.529960 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-md2jq" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.532876 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9322595dcab95928114d1229b303aecadda923eaee05d67e89cc2afd036cbdc4" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.582215 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.620996 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 09:31:39 crc kubenswrapper[4867]: E1201 09:31:39.621558 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba41ac2-513c-437d-a26c-7b341306bddc" containerName="nova-cell1-conductor-db-sync" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.621580 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba41ac2-513c-437d-a26c-7b341306bddc" containerName="nova-cell1-conductor-db-sync" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.621859 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="aba41ac2-513c-437d-a26c-7b341306bddc" containerName="nova-cell1-conductor-db-sync" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.622634 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.627277 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.650325 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.785280 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c8nl\" (UniqueName: \"kubernetes.io/projected/13963b70-5558-4e19-9b73-555d74be129a-kube-api-access-7c8nl\") pod \"nova-cell1-conductor-0\" (UID: \"13963b70-5558-4e19-9b73-555d74be129a\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.785388 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13963b70-5558-4e19-9b73-555d74be129a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"13963b70-5558-4e19-9b73-555d74be129a\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.785432 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13963b70-5558-4e19-9b73-555d74be129a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"13963b70-5558-4e19-9b73-555d74be129a\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.886629 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13963b70-5558-4e19-9b73-555d74be129a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"13963b70-5558-4e19-9b73-555d74be129a\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc 
kubenswrapper[4867]: I1201 09:31:39.886674 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13963b70-5558-4e19-9b73-555d74be129a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"13963b70-5558-4e19-9b73-555d74be129a\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.886763 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c8nl\" (UniqueName: \"kubernetes.io/projected/13963b70-5558-4e19-9b73-555d74be129a-kube-api-access-7c8nl\") pod \"nova-cell1-conductor-0\" (UID: \"13963b70-5558-4e19-9b73-555d74be129a\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.891719 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13963b70-5558-4e19-9b73-555d74be129a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"13963b70-5558-4e19-9b73-555d74be129a\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.892196 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13963b70-5558-4e19-9b73-555d74be129a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"13963b70-5558-4e19-9b73-555d74be129a\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.903777 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c8nl\" (UniqueName: \"kubernetes.io/projected/13963b70-5558-4e19-9b73-555d74be129a-kube-api-access-7c8nl\") pod \"nova-cell1-conductor-0\" (UID: \"13963b70-5558-4e19-9b73-555d74be129a\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.947069 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:39 crc kubenswrapper[4867]: I1201 09:31:39.982026 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.089724 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzmsw\" (UniqueName: \"kubernetes.io/projected/d4cb4881-0b4e-4085-9627-1efc85a5efaa-kube-api-access-jzmsw\") pod \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.090065 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-scripts\") pod \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.090142 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-config-data\") pod \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.090184 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-combined-ca-bundle\") pod \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\" (UID: \"d4cb4881-0b4e-4085-9627-1efc85a5efaa\") " Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.096027 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cb4881-0b4e-4085-9627-1efc85a5efaa-kube-api-access-jzmsw" (OuterVolumeSpecName: "kube-api-access-jzmsw") pod "d4cb4881-0b4e-4085-9627-1efc85a5efaa" (UID: 
"d4cb4881-0b4e-4085-9627-1efc85a5efaa"). InnerVolumeSpecName "kube-api-access-jzmsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.102092 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-scripts" (OuterVolumeSpecName: "scripts") pod "d4cb4881-0b4e-4085-9627-1efc85a5efaa" (UID: "d4cb4881-0b4e-4085-9627-1efc85a5efaa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.124464 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4cb4881-0b4e-4085-9627-1efc85a5efaa" (UID: "d4cb4881-0b4e-4085-9627-1efc85a5efaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.132189 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-config-data" (OuterVolumeSpecName: "config-data") pod "d4cb4881-0b4e-4085-9627-1efc85a5efaa" (UID: "d4cb4881-0b4e-4085-9627-1efc85a5efaa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.192802 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzmsw\" (UniqueName: \"kubernetes.io/projected/d4cb4881-0b4e-4085-9627-1efc85a5efaa-kube-api-access-jzmsw\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.192886 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.192897 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.192906 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cb4881-0b4e-4085-9627-1efc85a5efaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.438230 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.541136 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"13963b70-5558-4e19-9b73-555d74be129a","Type":"ContainerStarted","Data":"9de32333f543cb6930995dd38f3e8bd26db4ccf958c64c102382b98cdabd762f"} Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.545074 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zvrlk" event={"ID":"d4cb4881-0b4e-4085-9627-1efc85a5efaa","Type":"ContainerDied","Data":"7bd7bee536bed5d8ab073736cae20620eed752dc74f597458092ed36afd1ec9e"} Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.545122 4867 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="7bd7bee536bed5d8ab073736cae20620eed752dc74f597458092ed36afd1ec9e" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.545447 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zvrlk" Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.711016 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.711790 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-api" containerID="cri-o://e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5" gracePeriod=30 Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.711636 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-log" containerID="cri-o://6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2" gracePeriod=30 Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.739475 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.739886 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a6fb7b9a-85bc-4023-b2f6-68f713287770" containerName="nova-scheduler-scheduler" containerID="cri-o://bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652" gracePeriod=30 Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.775983 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.776263 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerName="nova-metadata-log" containerID="cri-o://deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27" gracePeriod=30 Dec 01 09:31:40 crc kubenswrapper[4867]: I1201 09:31:40.776837 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerName="nova-metadata-metadata" containerID="cri-o://a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79" gracePeriod=30 Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.461754 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.558732 4867 generic.go:334] "Generic (PLEG): container finished" podID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerID="a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79" exitCode=0 Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.558763 4867 generic.go:334] "Generic (PLEG): container finished" podID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerID="deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27" exitCode=143 Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.558778 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.558834 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9071cf8c-0cf8-463d-9857-024b96bc1dd4","Type":"ContainerDied","Data":"a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79"} Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.558861 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9071cf8c-0cf8-463d-9857-024b96bc1dd4","Type":"ContainerDied","Data":"deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27"} Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.558871 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9071cf8c-0cf8-463d-9857-024b96bc1dd4","Type":"ContainerDied","Data":"a0fddd7451b06c84575ce73d1522c7409817d7429352a5a4d57aa8bbde6aca24"} Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.558887 4867 scope.go:117] "RemoveContainer" containerID="a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.566225 4867 generic.go:334] "Generic (PLEG): container finished" podID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerID="6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2" exitCode=143 Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.566311 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5595c9f3-b2c1-4104-8752-eb39efd14d1c","Type":"ContainerDied","Data":"6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2"} Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.572483 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"13963b70-5558-4e19-9b73-555d74be129a","Type":"ContainerStarted","Data":"f450155039476c7b18bbabbcd6e918ea5e5cd0163b6cee2a37f072e174fcdb69"} Dec 
01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.572735 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.601404 4867 scope.go:117] "RemoveContainer" containerID="deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.605957 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.605943353 podStartE2EDuration="2.605943353s" podCreationTimestamp="2025-12-01 09:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:31:41.603335742 +0000 UTC m=+1423.062722496" watchObservedRunningTime="2025-12-01 09:31:41.605943353 +0000 UTC m=+1423.065330107" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.639614 4867 scope.go:117] "RemoveContainer" containerID="a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79" Dec 01 09:31:41 crc kubenswrapper[4867]: E1201 09:31:41.640066 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79\": container with ID starting with a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79 not found: ID does not exist" containerID="a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.640098 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79"} err="failed to get container status \"a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79\": rpc error: code = NotFound desc = could not find container 
\"a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79\": container with ID starting with a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79 not found: ID does not exist" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.640121 4867 scope.go:117] "RemoveContainer" containerID="deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27" Dec 01 09:31:41 crc kubenswrapper[4867]: E1201 09:31:41.640514 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27\": container with ID starting with deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27 not found: ID does not exist" containerID="deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.640545 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27"} err="failed to get container status \"deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27\": rpc error: code = NotFound desc = could not find container \"deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27\": container with ID starting with deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27 not found: ID does not exist" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.640567 4867 scope.go:117] "RemoveContainer" containerID="a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.640883 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79"} err="failed to get container status \"a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79\": rpc error: code = NotFound desc = could not find 
container \"a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79\": container with ID starting with a2c4b0e1717b161b1bad2731f04844da5615c3af26b2dc5dae8fc4cac808bf79 not found: ID does not exist" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.640986 4867 scope.go:117] "RemoveContainer" containerID="deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.641540 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27"} err="failed to get container status \"deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27\": rpc error: code = NotFound desc = could not find container \"deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27\": container with ID starting with deeaba356c4a387be863b8a0924c8683b2591b4696a01e3789c95afe5b0ede27 not found: ID does not exist" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.651252 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9071cf8c-0cf8-463d-9857-024b96bc1dd4-logs\") pod \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.651568 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-config-data\") pod \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.651638 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-nova-metadata-tls-certs\") pod \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\" (UID: 
\"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.651718 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-combined-ca-bundle\") pod \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.651976 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtbpb\" (UniqueName: \"kubernetes.io/projected/9071cf8c-0cf8-463d-9857-024b96bc1dd4-kube-api-access-mtbpb\") pod \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\" (UID: \"9071cf8c-0cf8-463d-9857-024b96bc1dd4\") " Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.652390 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9071cf8c-0cf8-463d-9857-024b96bc1dd4-logs" (OuterVolumeSpecName: "logs") pod "9071cf8c-0cf8-463d-9857-024b96bc1dd4" (UID: "9071cf8c-0cf8-463d-9857-024b96bc1dd4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.668171 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9071cf8c-0cf8-463d-9857-024b96bc1dd4-kube-api-access-mtbpb" (OuterVolumeSpecName: "kube-api-access-mtbpb") pod "9071cf8c-0cf8-463d-9857-024b96bc1dd4" (UID: "9071cf8c-0cf8-463d-9857-024b96bc1dd4"). InnerVolumeSpecName "kube-api-access-mtbpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.695275 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9071cf8c-0cf8-463d-9857-024b96bc1dd4" (UID: "9071cf8c-0cf8-463d-9857-024b96bc1dd4"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.697515 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-config-data" (OuterVolumeSpecName: "config-data") pod "9071cf8c-0cf8-463d-9857-024b96bc1dd4" (UID: "9071cf8c-0cf8-463d-9857-024b96bc1dd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.742044 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9071cf8c-0cf8-463d-9857-024b96bc1dd4" (UID: "9071cf8c-0cf8-463d-9857-024b96bc1dd4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.758210 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtbpb\" (UniqueName: \"kubernetes.io/projected/9071cf8c-0cf8-463d-9857-024b96bc1dd4-kube-api-access-mtbpb\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.758269 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9071cf8c-0cf8-463d-9857-024b96bc1dd4-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.758281 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.758293 4867 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.758304 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9071cf8c-0cf8-463d-9857-024b96bc1dd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.928083 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.944566 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.954785 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:41 crc kubenswrapper[4867]: E1201 09:31:41.955280 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cb4881-0b4e-4085-9627-1efc85a5efaa" containerName="nova-manage" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.955306 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cb4881-0b4e-4085-9627-1efc85a5efaa" containerName="nova-manage" Dec 01 09:31:41 crc kubenswrapper[4867]: E1201 09:31:41.955345 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerName="nova-metadata-log" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.955353 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerName="nova-metadata-log" Dec 01 09:31:41 crc kubenswrapper[4867]: E1201 09:31:41.955368 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerName="nova-metadata-metadata" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.955376 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerName="nova-metadata-metadata" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.955613 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerName="nova-metadata-metadata" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.955655 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" containerName="nova-metadata-log" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.955667 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4cb4881-0b4e-4085-9627-1efc85a5efaa" containerName="nova-manage" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.956895 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.960075 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.974498 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 09:31:41 crc kubenswrapper[4867]: I1201 09:31:41.984782 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.063833 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80eeedcc-8486-4395-a708-2fb4b89945d4-logs\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.063961 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.064102 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-config-data\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.064214 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6rxw\" (UniqueName: \"kubernetes.io/projected/80eeedcc-8486-4395-a708-2fb4b89945d4-kube-api-access-q6rxw\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.064299 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: E1201 09:31:42.079071 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:31:42 crc kubenswrapper[4867]: E1201 09:31:42.080885 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:31:42 crc kubenswrapper[4867]: E1201 09:31:42.082236 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:31:42 crc kubenswrapper[4867]: E1201 09:31:42.082294 4867 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a6fb7b9a-85bc-4023-b2f6-68f713287770" containerName="nova-scheduler-scheduler" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.165984 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6rxw\" (UniqueName: \"kubernetes.io/projected/80eeedcc-8486-4395-a708-2fb4b89945d4-kube-api-access-q6rxw\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.166306 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.166456 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80eeedcc-8486-4395-a708-2fb4b89945d4-logs\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " 
pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.166583 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.166701 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-config-data\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.166985 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80eeedcc-8486-4395-a708-2fb4b89945d4-logs\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.171669 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.172536 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-config-data\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.189693 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.194265 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6rxw\" (UniqueName: \"kubernetes.io/projected/80eeedcc-8486-4395-a708-2fb4b89945d4-kube-api-access-q6rxw\") pod \"nova-metadata-0\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.280423 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.762299 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:31:42 crc kubenswrapper[4867]: I1201 09:31:42.837860 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9071cf8c-0cf8-463d-9857-024b96bc1dd4" path="/var/lib/kubelet/pods/9071cf8c-0cf8-463d-9857-024b96bc1dd4/volumes" Dec 01 09:31:43 crc kubenswrapper[4867]: I1201 09:31:43.614071 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80eeedcc-8486-4395-a708-2fb4b89945d4","Type":"ContainerStarted","Data":"7da8d55b80bee370279a8cb0cab58eac6870e3802a895d79a05978f6112fb54d"} Dec 01 09:31:43 crc kubenswrapper[4867]: I1201 09:31:43.614377 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80eeedcc-8486-4395-a708-2fb4b89945d4","Type":"ContainerStarted","Data":"3dde498faff44d72cae817655d563c7bc28869a7d7522cc84119ae382c0b827b"} Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.035083 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.150909 4867 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.298522 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h9tkh"] Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.482473 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.528195 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-config-data\") pod \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.528271 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5595c9f3-b2c1-4104-8752-eb39efd14d1c-logs\") pod \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.528322 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-combined-ca-bundle\") pod \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.528359 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drttj\" (UniqueName: \"kubernetes.io/projected/5595c9f3-b2c1-4104-8752-eb39efd14d1c-kube-api-access-drttj\") pod \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\" (UID: \"5595c9f3-b2c1-4104-8752-eb39efd14d1c\") " Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.529766 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/5595c9f3-b2c1-4104-8752-eb39efd14d1c-logs" (OuterVolumeSpecName: "logs") pod "5595c9f3-b2c1-4104-8752-eb39efd14d1c" (UID: "5595c9f3-b2c1-4104-8752-eb39efd14d1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.535981 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5595c9f3-b2c1-4104-8752-eb39efd14d1c-kube-api-access-drttj" (OuterVolumeSpecName: "kube-api-access-drttj") pod "5595c9f3-b2c1-4104-8752-eb39efd14d1c" (UID: "5595c9f3-b2c1-4104-8752-eb39efd14d1c"). InnerVolumeSpecName "kube-api-access-drttj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.555410 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5595c9f3-b2c1-4104-8752-eb39efd14d1c" (UID: "5595c9f3-b2c1-4104-8752-eb39efd14d1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.579889 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-config-data" (OuterVolumeSpecName: "config-data") pod "5595c9f3-b2c1-4104-8752-eb39efd14d1c" (UID: "5595c9f3-b2c1-4104-8752-eb39efd14d1c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.630201 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.630244 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5595c9f3-b2c1-4104-8752-eb39efd14d1c-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.630253 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5595c9f3-b2c1-4104-8752-eb39efd14d1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.630263 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drttj\" (UniqueName: \"kubernetes.io/projected/5595c9f3-b2c1-4104-8752-eb39efd14d1c-kube-api-access-drttj\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.630439 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80eeedcc-8486-4395-a708-2fb4b89945d4","Type":"ContainerStarted","Data":"6a3e8514ac7be518d49721c25fd01ea6ac55ff925d070872e28f3899e8490d7d"} Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.637882 4867 generic.go:334] "Generic (PLEG): container finished" podID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerID="e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5" exitCode=0 Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.638207 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.638704 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5595c9f3-b2c1-4104-8752-eb39efd14d1c","Type":"ContainerDied","Data":"e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5"} Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.638731 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5595c9f3-b2c1-4104-8752-eb39efd14d1c","Type":"ContainerDied","Data":"1c5a720f3aed51b5334847810837fec9878aeb813f0c521c0828d8bde871186c"} Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.638759 4867 scope.go:117] "RemoveContainer" containerID="e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.673415 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.673397778 podStartE2EDuration="3.673397778s" podCreationTimestamp="2025-12-01 09:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:31:44.653320616 +0000 UTC m=+1426.112707370" watchObservedRunningTime="2025-12-01 09:31:44.673397778 +0000 UTC m=+1426.132784532" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.686769 4867 scope.go:117] "RemoveContainer" containerID="6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.689880 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.721879 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.742005 4867 scope.go:117] "RemoveContainer" 
containerID="e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.742460 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 09:31:44 crc kubenswrapper[4867]: E1201 09:31:44.742921 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-log" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.742933 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-log" Dec 01 09:31:44 crc kubenswrapper[4867]: E1201 09:31:44.742959 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-api" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.742965 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-api" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.743157 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-api" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.743173 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" containerName="nova-api-log" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.757860 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: E1201 09:31:44.758419 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5\": container with ID starting with e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5 not found: ID does not exist" containerID="e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.758485 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5"} err="failed to get container status \"e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5\": rpc error: code = NotFound desc = could not find container \"e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5\": container with ID starting with e3bc1bf094aed3c51c28c93519216afad44fb9221213eb311833df29b37fb5d5 not found: ID does not exist" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.758511 4867 scope.go:117] "RemoveContainer" containerID="6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2" Dec 01 09:31:44 crc kubenswrapper[4867]: E1201 09:31:44.759887 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2\": container with ID starting with 6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2 not found: ID does not exist" containerID="6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.760106 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2"} err="failed to 
get container status \"6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2\": rpc error: code = NotFound desc = could not find container \"6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2\": container with ID starting with 6abdf750f48576e332533d7b2ec64160cb859f9cbc98a9e36ce4297ebbeb02e2 not found: ID does not exist" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.762611 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.807710 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.834857 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e748b11-e821-47c1-850b-e9812ec6bafd-logs\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.834921 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.834989 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgmhp\" (UniqueName: \"kubernetes.io/projected/2e748b11-e821-47c1-850b-e9812ec6bafd-kube-api-access-bgmhp\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.835121 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-config-data\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.846521 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5595c9f3-b2c1-4104-8752-eb39efd14d1c" path="/var/lib/kubelet/pods/5595c9f3-b2c1-4104-8752-eb39efd14d1c/volumes" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.936784 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e748b11-e821-47c1-850b-e9812ec6bafd-logs\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.936845 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.936890 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgmhp\" (UniqueName: \"kubernetes.io/projected/2e748b11-e821-47c1-850b-e9812ec6bafd-kube-api-access-bgmhp\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.936970 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-config-data\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.937350 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2e748b11-e821-47c1-850b-e9812ec6bafd-logs\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.944927 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-config-data\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.945753 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.976761 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.977636 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgmhp\" (UniqueName: \"kubernetes.io/projected/2e748b11-e821-47c1-850b-e9812ec6bafd-kube-api-access-bgmhp\") pod \"nova-api-0\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " pod="openstack/nova-api-0" Dec 01 09:31:44 crc kubenswrapper[4867]: I1201 09:31:44.977743 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1847ceb6-9b89-4f58-8bc0-d70d28fe4890" containerName="kube-state-metrics" containerID="cri-o://4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16" gracePeriod=30 Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.092323 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.522343 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.552225 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2k8m\" (UniqueName: \"kubernetes.io/projected/a6fb7b9a-85bc-4023-b2f6-68f713287770-kube-api-access-x2k8m\") pod \"a6fb7b9a-85bc-4023-b2f6-68f713287770\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.552306 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-config-data\") pod \"a6fb7b9a-85bc-4023-b2f6-68f713287770\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.552343 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-combined-ca-bundle\") pod \"a6fb7b9a-85bc-4023-b2f6-68f713287770\" (UID: \"a6fb7b9a-85bc-4023-b2f6-68f713287770\") " Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.563834 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6fb7b9a-85bc-4023-b2f6-68f713287770-kube-api-access-x2k8m" (OuterVolumeSpecName: "kube-api-access-x2k8m") pod "a6fb7b9a-85bc-4023-b2f6-68f713287770" (UID: "a6fb7b9a-85bc-4023-b2f6-68f713287770"). InnerVolumeSpecName "kube-api-access-x2k8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.622802 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-config-data" (OuterVolumeSpecName: "config-data") pod "a6fb7b9a-85bc-4023-b2f6-68f713287770" (UID: "a6fb7b9a-85bc-4023-b2f6-68f713287770"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.628390 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6fb7b9a-85bc-4023-b2f6-68f713287770" (UID: "a6fb7b9a-85bc-4023-b2f6-68f713287770"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.646917 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.654546 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.654613 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2k8m\" (UniqueName: \"kubernetes.io/projected/a6fb7b9a-85bc-4023-b2f6-68f713287770-kube-api-access-x2k8m\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.654630 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6fb7b9a-85bc-4023-b2f6-68f713287770-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.670834 4867 generic.go:334] "Generic (PLEG): container finished" podID="1847ceb6-9b89-4f58-8bc0-d70d28fe4890" containerID="4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16" exitCode=2 Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.670942 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.670954 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1847ceb6-9b89-4f58-8bc0-d70d28fe4890","Type":"ContainerDied","Data":"4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16"} Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.671067 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1847ceb6-9b89-4f58-8bc0-d70d28fe4890","Type":"ContainerDied","Data":"9f23bc4842e0d1072afb42ce22297ba71e3977a87dca7bc359ad5327296b443b"} Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.671088 4867 scope.go:117] "RemoveContainer" containerID="4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.686588 4867 generic.go:334] "Generic (PLEG): container finished" podID="a6fb7b9a-85bc-4023-b2f6-68f713287770" containerID="bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652" exitCode=0 Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.686682 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6fb7b9a-85bc-4023-b2f6-68f713287770","Type":"ContainerDied","Data":"bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652"} Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.686729 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6fb7b9a-85bc-4023-b2f6-68f713287770","Type":"ContainerDied","Data":"5ea5ec93a4277105498b9380bdc8727bb480e6f1d27cdb1523210c33b2801b4d"} Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.686789 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.689158 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h9tkh" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="registry-server" containerID="cri-o://3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73" gracePeriod=2 Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.720174 4867 scope.go:117] "RemoveContainer" containerID="4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16" Dec 01 09:31:45 crc kubenswrapper[4867]: E1201 09:31:45.722427 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16\": container with ID starting with 4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16 not found: ID does not exist" containerID="4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.722466 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16"} err="failed to get container status \"4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16\": rpc error: code = NotFound desc = could not find container \"4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16\": container with ID starting with 4323803bf8fb0816a71ad49c6b0efb002dda412b606f62d82add8d6e4cdbba16 not found: ID does not exist" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.722487 4867 scope.go:117] "RemoveContainer" containerID="bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.759175 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wxpzc\" (UniqueName: \"kubernetes.io/projected/1847ceb6-9b89-4f58-8bc0-d70d28fe4890-kube-api-access-wxpzc\") pod \"1847ceb6-9b89-4f58-8bc0-d70d28fe4890\" (UID: \"1847ceb6-9b89-4f58-8bc0-d70d28fe4890\") " Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.775766 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.778272 4867 scope.go:117] "RemoveContainer" containerID="bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652" Dec 01 09:31:45 crc kubenswrapper[4867]: E1201 09:31:45.778690 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652\": container with ID starting with bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652 not found: ID does not exist" containerID="bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.778725 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652"} err="failed to get container status \"bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652\": rpc error: code = NotFound desc = could not find container \"bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652\": container with ID starting with bfd28e1d2c95d44b53eebc41d19e2b116d59a7788455f21671526e32e62b3652 not found: ID does not exist" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.786741 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.788658 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1847ceb6-9b89-4f58-8bc0-d70d28fe4890-kube-api-access-wxpzc" 
(OuterVolumeSpecName: "kube-api-access-wxpzc") pod "1847ceb6-9b89-4f58-8bc0-d70d28fe4890" (UID: "1847ceb6-9b89-4f58-8bc0-d70d28fe4890"). InnerVolumeSpecName "kube-api-access-wxpzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.803772 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.825074 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:31:45 crc kubenswrapper[4867]: E1201 09:31:45.825471 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1847ceb6-9b89-4f58-8bc0-d70d28fe4890" containerName="kube-state-metrics" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.825489 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1847ceb6-9b89-4f58-8bc0-d70d28fe4890" containerName="kube-state-metrics" Dec 01 09:31:45 crc kubenswrapper[4867]: E1201 09:31:45.825500 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fb7b9a-85bc-4023-b2f6-68f713287770" containerName="nova-scheduler-scheduler" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.825508 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fb7b9a-85bc-4023-b2f6-68f713287770" containerName="nova-scheduler-scheduler" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.825736 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1847ceb6-9b89-4f58-8bc0-d70d28fe4890" containerName="kube-state-metrics" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.825766 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6fb7b9a-85bc-4023-b2f6-68f713287770" containerName="nova-scheduler-scheduler" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.827086 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.831149 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.855016 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.860857 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-config-data\") pod \"nova-scheduler-0\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.860904 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtslm\" (UniqueName: \"kubernetes.io/projected/3c1d2a06-41e8-483e-8fe6-ce90496010bc-kube-api-access-dtslm\") pod \"nova-scheduler-0\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.861030 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.861095 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxpzc\" (UniqueName: \"kubernetes.io/projected/1847ceb6-9b89-4f58-8bc0-d70d28fe4890-kube-api-access-wxpzc\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.962923 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-config-data\") pod \"nova-scheduler-0\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.963293 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtslm\" (UniqueName: \"kubernetes.io/projected/3c1d2a06-41e8-483e-8fe6-ce90496010bc-kube-api-access-dtslm\") pod \"nova-scheduler-0\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.963403 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.970106 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.970697 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-config-data\") pod \"nova-scheduler-0\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " pod="openstack/nova-scheduler-0" Dec 01 09:31:45 crc kubenswrapper[4867]: I1201 09:31:45.999023 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtslm\" (UniqueName: \"kubernetes.io/projected/3c1d2a06-41e8-483e-8fe6-ce90496010bc-kube-api-access-dtslm\") pod \"nova-scheduler-0\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " 
pod="openstack/nova-scheduler-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.156716 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.158203 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.210337 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.219988 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.221336 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.226452 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.226631 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.231443 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.276202 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3d78f955-151d-46a9-9ef3-183051c318e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.276275 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3d78f955-151d-46a9-9ef3-183051c318e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.276323 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5pqj\" (UniqueName: \"kubernetes.io/projected/3d78f955-151d-46a9-9ef3-183051c318e6-kube-api-access-v5pqj\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.276360 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d78f955-151d-46a9-9ef3-183051c318e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.308318 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.377397 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-catalog-content\") pod \"b0cf63e8-e586-4cac-990d-43c976b30366\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.377446 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-utilities\") pod \"b0cf63e8-e586-4cac-990d-43c976b30366\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.377574 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q768s\" (UniqueName: \"kubernetes.io/projected/b0cf63e8-e586-4cac-990d-43c976b30366-kube-api-access-q768s\") pod \"b0cf63e8-e586-4cac-990d-43c976b30366\" (UID: \"b0cf63e8-e586-4cac-990d-43c976b30366\") " Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.377735 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5pqj\" (UniqueName: \"kubernetes.io/projected/3d78f955-151d-46a9-9ef3-183051c318e6-kube-api-access-v5pqj\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.377779 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d78f955-151d-46a9-9ef3-183051c318e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.377887 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3d78f955-151d-46a9-9ef3-183051c318e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.377932 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d78f955-151d-46a9-9ef3-183051c318e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.378389 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-utilities" (OuterVolumeSpecName: "utilities") pod "b0cf63e8-e586-4cac-990d-43c976b30366" (UID: "b0cf63e8-e586-4cac-990d-43c976b30366"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.383629 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d78f955-151d-46a9-9ef3-183051c318e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.384776 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d78f955-151d-46a9-9ef3-183051c318e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.385670 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3d78f955-151d-46a9-9ef3-183051c318e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.386066 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cf63e8-e586-4cac-990d-43c976b30366-kube-api-access-q768s" (OuterVolumeSpecName: "kube-api-access-q768s") pod "b0cf63e8-e586-4cac-990d-43c976b30366" (UID: "b0cf63e8-e586-4cac-990d-43c976b30366"). InnerVolumeSpecName "kube-api-access-q768s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.397032 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5pqj\" (UniqueName: \"kubernetes.io/projected/3d78f955-151d-46a9-9ef3-183051c318e6-kube-api-access-v5pqj\") pod \"kube-state-metrics-0\" (UID: \"3d78f955-151d-46a9-9ef3-183051c318e6\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.480075 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.480107 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q768s\" (UniqueName: \"kubernetes.io/projected/b0cf63e8-e586-4cac-990d-43c976b30366-kube-api-access-q768s\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.498412 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0cf63e8-e586-4cac-990d-43c976b30366" (UID: "b0cf63e8-e586-4cac-990d-43c976b30366"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.582316 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0cf63e8-e586-4cac-990d-43c976b30366-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.620600 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.731174 4867 generic.go:334] "Generic (PLEG): container finished" podID="b0cf63e8-e586-4cac-990d-43c976b30366" containerID="3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73" exitCode=0 Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.731230 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h9tkh" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.731244 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h9tkh" event={"ID":"b0cf63e8-e586-4cac-990d-43c976b30366","Type":"ContainerDied","Data":"3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73"} Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.731272 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h9tkh" event={"ID":"b0cf63e8-e586-4cac-990d-43c976b30366","Type":"ContainerDied","Data":"9b4126d358320e01c3ca3ce3e5e60bde9ca5e7d0d333ade7e2d31e69977c1757"} Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.731289 4867 scope.go:117] "RemoveContainer" containerID="3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.732978 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.747148 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e748b11-e821-47c1-850b-e9812ec6bafd","Type":"ContainerStarted","Data":"4a74d24d68d5d840b8cb2237dbb0e3b14d9428896889eef774977cac0fd7e82c"} Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.747186 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2e748b11-e821-47c1-850b-e9812ec6bafd","Type":"ContainerStarted","Data":"556309ab756db3ad47a82037ddad0b526638f0aa86226cd781b2196ad0bfc152"} Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.747198 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e748b11-e821-47c1-850b-e9812ec6bafd","Type":"ContainerStarted","Data":"bb97567b879d5f2898501b769ae0af5f64a643b41f697a65e9d4360d5fa50f32"} Dec 01 09:31:46 crc kubenswrapper[4867]: W1201 09:31:46.753968 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c1d2a06_41e8_483e_8fe6_ce90496010bc.slice/crio-59472ee870adda222e3fc892490bc4056f5a1bb46ab2b923e47ffe8740082444 WatchSource:0}: Error finding container 59472ee870adda222e3fc892490bc4056f5a1bb46ab2b923e47ffe8740082444: Status 404 returned error can't find the container with id 59472ee870adda222e3fc892490bc4056f5a1bb46ab2b923e47ffe8740082444 Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.779936 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.779917 podStartE2EDuration="2.779917s" podCreationTimestamp="2025-12-01 09:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:31:46.766245144 +0000 UTC m=+1428.225631908" watchObservedRunningTime="2025-12-01 09:31:46.779917 +0000 UTC m=+1428.239303754" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.807113 4867 scope.go:117] "RemoveContainer" containerID="751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.925838 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1847ceb6-9b89-4f58-8bc0-d70d28fe4890" path="/var/lib/kubelet/pods/1847ceb6-9b89-4f58-8bc0-d70d28fe4890/volumes" Dec 01 09:31:46 crc 
kubenswrapper[4867]: I1201 09:31:46.930544 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6fb7b9a-85bc-4023-b2f6-68f713287770" path="/var/lib/kubelet/pods/a6fb7b9a-85bc-4023-b2f6-68f713287770/volumes" Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.931181 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h9tkh"] Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.931210 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h9tkh"] Dec 01 09:31:46 crc kubenswrapper[4867]: I1201 09:31:46.935984 4867 scope.go:117] "RemoveContainer" containerID="50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.104960 4867 scope.go:117] "RemoveContainer" containerID="3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73" Dec 01 09:31:47 crc kubenswrapper[4867]: E1201 09:31:47.106425 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73\": container with ID starting with 3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73 not found: ID does not exist" containerID="3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.106491 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73"} err="failed to get container status \"3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73\": rpc error: code = NotFound desc = could not find container \"3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73\": container with ID starting with 3e66c7bd639d6ab9515de8baec77b8b7dd2066339af695f38caf96ac73af8c73 not found: ID does not exist" Dec 01 
09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.106525 4867 scope.go:117] "RemoveContainer" containerID="751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959" Dec 01 09:31:47 crc kubenswrapper[4867]: E1201 09:31:47.107858 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959\": container with ID starting with 751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959 not found: ID does not exist" containerID="751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.107919 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959"} err="failed to get container status \"751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959\": rpc error: code = NotFound desc = could not find container \"751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959\": container with ID starting with 751f6eefe0bf1acd98b88c85b62efbb078278ac783d65255d683739446fec959 not found: ID does not exist" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.107941 4867 scope.go:117] "RemoveContainer" containerID="50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab" Dec 01 09:31:47 crc kubenswrapper[4867]: E1201 09:31:47.108454 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab\": container with ID starting with 50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab not found: ID does not exist" containerID="50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.108495 4867 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab"} err="failed to get container status \"50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab\": rpc error: code = NotFound desc = could not find container \"50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab\": container with ID starting with 50138792c8328a73ac91b25e33642929bf7d8234445a538a22e7701e588a94ab not found: ID does not exist" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.169902 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.280847 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.280900 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.449623 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.454557 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.768556 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d78f955-151d-46a9-9ef3-183051c318e6","Type":"ContainerStarted","Data":"f8799399b25a015114536da45bc005e4853f3dc6248e4f8aa1c3dc85ae9cc970"} Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.770925 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c1d2a06-41e8-483e-8fe6-ce90496010bc","Type":"ContainerStarted","Data":"f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990"} Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.771127 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c1d2a06-41e8-483e-8fe6-ce90496010bc","Type":"ContainerStarted","Data":"59472ee870adda222e3fc892490bc4056f5a1bb46ab2b923e47ffe8740082444"} Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.796263 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.796238635 podStartE2EDuration="2.796238635s" podCreationTimestamp="2025-12-01 09:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:31:47.786471857 +0000 UTC m=+1429.245858611" watchObservedRunningTime="2025-12-01 09:31:47.796238635 +0000 UTC m=+1429.255625389" Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.907187 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.907453 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="ceilometer-central-agent" containerID="cri-o://cf77186aa773bcab3bf2c79ec6ea40ad2f19edf5f3aaef1706e4acedad6b86ed" gracePeriod=30 Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.907634 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="proxy-httpd" containerID="cri-o://da9bc2922244a660217e6317aaa2a120ab01c97bca5a859e5bde4da85234edc0" gracePeriod=30 Dec 01 09:31:47 crc kubenswrapper[4867]: I1201 09:31:47.907737 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="sg-core" containerID="cri-o://ad06385dd1bb98a0d533ddc771da744c4662aaa6f19f1c612fa6ca360bb51354" gracePeriod=30 Dec 01 09:31:47 crc 
kubenswrapper[4867]: I1201 09:31:47.907985 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="ceilometer-notification-agent" containerID="cri-o://64f000c48da56ecc2fc797eaa871bda89e39c52aed740d50d8b8f28fd88931d7" gracePeriod=30 Dec 01 09:31:48 crc kubenswrapper[4867]: I1201 09:31:48.784671 4867 generic.go:334] "Generic (PLEG): container finished" podID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerID="da9bc2922244a660217e6317aaa2a120ab01c97bca5a859e5bde4da85234edc0" exitCode=0 Dec 01 09:31:48 crc kubenswrapper[4867]: I1201 09:31:48.785015 4867 generic.go:334] "Generic (PLEG): container finished" podID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerID="ad06385dd1bb98a0d533ddc771da744c4662aaa6f19f1c612fa6ca360bb51354" exitCode=2 Dec 01 09:31:48 crc kubenswrapper[4867]: I1201 09:31:48.784749 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerDied","Data":"da9bc2922244a660217e6317aaa2a120ab01c97bca5a859e5bde4da85234edc0"} Dec 01 09:31:48 crc kubenswrapper[4867]: I1201 09:31:48.785058 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerDied","Data":"ad06385dd1bb98a0d533ddc771da744c4662aaa6f19f1c612fa6ca360bb51354"} Dec 01 09:31:48 crc kubenswrapper[4867]: I1201 09:31:48.786733 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d78f955-151d-46a9-9ef3-183051c318e6","Type":"ContainerStarted","Data":"0b71580760db0639f1f246c870aaaebaefe331df4889a4eeeca160a014cce9f5"} Dec 01 09:31:48 crc kubenswrapper[4867]: I1201 09:31:48.811701 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.443179996 podStartE2EDuration="2.811679716s" 
podCreationTimestamp="2025-12-01 09:31:46 +0000 UTC" firstStartedPulling="2025-12-01 09:31:47.182213457 +0000 UTC m=+1428.641600211" lastFinishedPulling="2025-12-01 09:31:47.550713177 +0000 UTC m=+1429.010099931" observedRunningTime="2025-12-01 09:31:48.805965419 +0000 UTC m=+1430.265352173" watchObservedRunningTime="2025-12-01 09:31:48.811679716 +0000 UTC m=+1430.271066470" Dec 01 09:31:48 crc kubenswrapper[4867]: I1201 09:31:48.844573 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" path="/var/lib/kubelet/pods/b0cf63e8-e586-4cac-990d-43c976b30366/volumes" Dec 01 09:31:49 crc kubenswrapper[4867]: I1201 09:31:49.510385 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:31:49 crc kubenswrapper[4867]: I1201 09:31:49.799955 4867 generic.go:334] "Generic (PLEG): container finished" podID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerID="cf77186aa773bcab3bf2c79ec6ea40ad2f19edf5f3aaef1706e4acedad6b86ed" exitCode=0 Dec 01 09:31:49 crc kubenswrapper[4867]: I1201 09:31:49.800036 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerDied","Data":"cf77186aa773bcab3bf2c79ec6ea40ad2f19edf5f3aaef1706e4acedad6b86ed"} Dec 01 09:31:49 crc kubenswrapper[4867]: I1201 09:31:49.800187 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 09:31:49 crc kubenswrapper[4867]: I1201 09:31:49.959276 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-d47c7cb76-srf4p" Dec 01 09:31:49 crc kubenswrapper[4867]: I1201 09:31:49.974897 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 09:31:50 crc kubenswrapper[4867]: I1201 09:31:50.066330 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-c846795f4-k7mlj"] Dec 01 09:31:50 crc kubenswrapper[4867]: I1201 09:31:50.066590 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon-log" containerID="cri-o://7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f" gracePeriod=30 Dec 01 09:31:50 crc kubenswrapper[4867]: I1201 09:31:50.067103 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" containerID="cri-o://f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4" gracePeriod=30 Dec 01 09:31:51 crc kubenswrapper[4867]: I1201 09:31:51.158741 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 09:31:52 crc kubenswrapper[4867]: I1201 09:31:52.281274 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 09:31:52 crc kubenswrapper[4867]: I1201 09:31:52.281728 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 09:31:52 crc kubenswrapper[4867]: I1201 09:31:52.842929 4867 generic.go:334] "Generic (PLEG): container finished" podID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerID="64f000c48da56ecc2fc797eaa871bda89e39c52aed740d50d8b8f28fd88931d7" exitCode=0 Dec 01 09:31:52 crc kubenswrapper[4867]: I1201 09:31:52.842977 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerDied","Data":"64f000c48da56ecc2fc797eaa871bda89e39c52aed740d50d8b8f28fd88931d7"} Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.280555 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c846795f4-k7mlj" 
podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:50174->10.217.0.145:8443: read: connection reset by peer" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.301152 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.301182 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.308190 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.510534 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-config-data\") pod \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.510633 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-combined-ca-bundle\") pod \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.510686 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-scripts\") pod \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.511230 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-log-httpd\") pod \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.511317 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-sg-core-conf-yaml\") pod \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.511364 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-run-httpd\") pod \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.511444 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4m7w\" (UniqueName: \"kubernetes.io/projected/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-kube-api-access-p4m7w\") pod \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\" (UID: \"b12b9b87-3dcb-4b41-b66f-9cda88c6d921\") " Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.511608 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b12b9b87-3dcb-4b41-b66f-9cda88c6d921" (UID: "b12b9b87-3dcb-4b41-b66f-9cda88c6d921"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.511767 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b12b9b87-3dcb-4b41-b66f-9cda88c6d921" (UID: "b12b9b87-3dcb-4b41-b66f-9cda88c6d921"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.512984 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.513417 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.531206 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-kube-api-access-p4m7w" (OuterVolumeSpecName: "kube-api-access-p4m7w") pod "b12b9b87-3dcb-4b41-b66f-9cda88c6d921" (UID: "b12b9b87-3dcb-4b41-b66f-9cda88c6d921"). InnerVolumeSpecName "kube-api-access-p4m7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.539135 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-scripts" (OuterVolumeSpecName: "scripts") pod "b12b9b87-3dcb-4b41-b66f-9cda88c6d921" (UID: "b12b9b87-3dcb-4b41-b66f-9cda88c6d921"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.612255 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b12b9b87-3dcb-4b41-b66f-9cda88c6d921" (UID: "b12b9b87-3dcb-4b41-b66f-9cda88c6d921"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.626313 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4m7w\" (UniqueName: \"kubernetes.io/projected/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-kube-api-access-p4m7w\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.626390 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.626400 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.633539 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b12b9b87-3dcb-4b41-b66f-9cda88c6d921" (UID: "b12b9b87-3dcb-4b41-b66f-9cda88c6d921"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.693252 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-config-data" (OuterVolumeSpecName: "config-data") pod "b12b9b87-3dcb-4b41-b66f-9cda88c6d921" (UID: "b12b9b87-3dcb-4b41-b66f-9cda88c6d921"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.728153 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.728193 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12b9b87-3dcb-4b41-b66f-9cda88c6d921-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.856254 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b12b9b87-3dcb-4b41-b66f-9cda88c6d921","Type":"ContainerDied","Data":"e133893098e3230a737ddfab4733a36f6c1046070857ac121591b26b7c993bcb"} Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.856283 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.856310 4867 scope.go:117] "RemoveContainer" containerID="da9bc2922244a660217e6317aaa2a120ab01c97bca5a859e5bde4da85234edc0" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.871796 4867 generic.go:334] "Generic (PLEG): container finished" podID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerID="f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4" exitCode=0 Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.871871 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerDied","Data":"f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4"} Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.896506 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 
09:31:53.904518 4867 scope.go:117] "RemoveContainer" containerID="ad06385dd1bb98a0d533ddc771da744c4662aaa6f19f1c612fa6ca360bb51354" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.921913 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.943580 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.945114 4867 scope.go:117] "RemoveContainer" containerID="64f000c48da56ecc2fc797eaa871bda89e39c52aed740d50d8b8f28fd88931d7" Dec 01 09:31:53 crc kubenswrapper[4867]: E1201 09:31:53.946261 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="sg-core" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946285 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="sg-core" Dec 01 09:31:53 crc kubenswrapper[4867]: E1201 09:31:53.946299 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="extract-utilities" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946308 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="extract-utilities" Dec 01 09:31:53 crc kubenswrapper[4867]: E1201 09:31:53.946331 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="proxy-httpd" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946341 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="proxy-httpd" Dec 01 09:31:53 crc kubenswrapper[4867]: E1201 09:31:53.946356 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="ceilometer-notification-agent" Dec 
01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946364 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="ceilometer-notification-agent" Dec 01 09:31:53 crc kubenswrapper[4867]: E1201 09:31:53.946372 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="extract-content" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946389 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="extract-content" Dec 01 09:31:53 crc kubenswrapper[4867]: E1201 09:31:53.946405 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="ceilometer-central-agent" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946412 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="ceilometer-central-agent" Dec 01 09:31:53 crc kubenswrapper[4867]: E1201 09:31:53.946436 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="registry-server" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946443 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="registry-server" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946658 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="sg-core" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946677 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cf63e8-e586-4cac-990d-43c976b30366" containerName="registry-server" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946695 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" 
containerName="ceilometer-central-agent" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946718 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="proxy-httpd" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.946732 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" containerName="ceilometer-notification-agent" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.949143 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.981468 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.986541 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:31:53 crc kubenswrapper[4867]: I1201 09:31:53.989491 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.006592 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.017551 4867 scope.go:117] "RemoveContainer" containerID="cf77186aa773bcab3bf2c79ec6ea40ad2f19edf5f3aaef1706e4acedad6b86ed" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.035496 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-config-data\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.035841 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-scripts\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.035971 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-run-httpd\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.036126 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.036303 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-log-httpd\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.036416 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.036529 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnrdw\" (UniqueName: \"kubernetes.io/projected/10932a09-ea39-4e72-808d-d56122867d2a-kube-api-access-fnrdw\") pod \"ceilometer-0\" (UID: 
\"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.036658 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.051636 4867 scope.go:117] "RemoveContainer" containerID="c7ec8779b58f97fafd5134c7e65888047b97ae6e37afffbfa008a76f648c7186" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.140660 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-config-data\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.140697 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-scripts\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.140745 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-run-httpd\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.140818 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.140837 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-log-httpd\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.140853 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.140877 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnrdw\" (UniqueName: \"kubernetes.io/projected/10932a09-ea39-4e72-808d-d56122867d2a-kube-api-access-fnrdw\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.141318 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.142255 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-log-httpd\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.142624 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-run-httpd\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.150362 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.150850 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-config-data\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.151436 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-scripts\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.152369 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.154683 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 
09:31:54.160160 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnrdw\" (UniqueName: \"kubernetes.io/projected/10932a09-ea39-4e72-808d-d56122867d2a-kube-api-access-fnrdw\") pod \"ceilometer-0\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.305344 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.844941 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12b9b87-3dcb-4b41-b66f-9cda88c6d921" path="/var/lib/kubelet/pods/b12b9b87-3dcb-4b41-b66f-9cda88c6d921/volumes" Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.853743 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:31:54 crc kubenswrapper[4867]: I1201 09:31:54.887705 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerStarted","Data":"44d5ec80b0a241d81a5ff1cb3df800f3d931503944898827fb5e6bfeb490080f"} Dec 01 09:31:55 crc kubenswrapper[4867]: I1201 09:31:55.093708 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:31:55 crc kubenswrapper[4867]: I1201 09:31:55.095532 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:31:55 crc kubenswrapper[4867]: I1201 09:31:55.896769 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerStarted","Data":"84915af6eafda3e2a49022b05dff200a1e8c6a021bcbf78d3b23d28bf426c760"} Dec 01 09:31:56 crc kubenswrapper[4867]: I1201 09:31:56.158917 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 09:31:56 
crc kubenswrapper[4867]: I1201 09:31:56.177073 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:31:56 crc kubenswrapper[4867]: I1201 09:31:56.177390 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:31:56 crc kubenswrapper[4867]: I1201 09:31:56.219479 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 09:31:56 crc kubenswrapper[4867]: I1201 09:31:56.642333 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 09:31:56 crc kubenswrapper[4867]: I1201 09:31:56.908080 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerStarted","Data":"0c5f1fce49b7d235eb2c663c45e8502c1355e005f213767a98e8fb6b6fb551dd"} Dec 01 09:31:56 crc kubenswrapper[4867]: I1201 09:31:56.943562 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 09:31:58 crc kubenswrapper[4867]: I1201 09:31:58.927891 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerStarted","Data":"bc4c053fa3322961c6a417677e4fa3e7640e231458021436ad7bfc1802013052"} Dec 01 09:32:00 crc kubenswrapper[4867]: I1201 09:32:00.961957 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerStarted","Data":"94ef95f022027bc8b0d1c2914b140376701eae047cbfc7d292d4d8d23ffc4c76"} Dec 01 09:32:00 crc kubenswrapper[4867]: I1201 09:32:00.962565 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.028215 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.110290052 podStartE2EDuration="8.028186469s" podCreationTimestamp="2025-12-01 09:31:53 +0000 UTC" firstStartedPulling="2025-12-01 09:31:54.871924363 +0000 UTC m=+1436.331311117" lastFinishedPulling="2025-12-01 09:31:59.78982077 +0000 UTC m=+1441.249207534" observedRunningTime="2025-12-01 09:32:00.99511524 +0000 UTC m=+1442.454502014" watchObservedRunningTime="2025-12-01 09:32:01.028186469 +0000 UTC m=+1442.487573233" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.754833 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.802955 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jw7z\" (UniqueName: \"kubernetes.io/projected/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-kube-api-access-9jw7z\") pod \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.803047 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-config-data\") pod \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.803256 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-combined-ca-bundle\") pod \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\" (UID: \"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e\") " Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.822499 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-kube-api-access-9jw7z" (OuterVolumeSpecName: "kube-api-access-9jw7z") pod "e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e" (UID: "e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e"). InnerVolumeSpecName "kube-api-access-9jw7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.850976 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-config-data" (OuterVolumeSpecName: "config-data") pod "e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e" (UID: "e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.884024 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e" (UID: "e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.909379 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.909670 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jw7z\" (UniqueName: \"kubernetes.io/projected/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-kube-api-access-9jw7z\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.909760 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.976464 4867 generic.go:334] "Generic (PLEG): container finished" podID="e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e" containerID="406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec" exitCode=137 Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.977419 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.978156 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e","Type":"ContainerDied","Data":"406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec"} Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.978197 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e","Type":"ContainerDied","Data":"9cfbacc64023085117a2c6121b215ff83baadc38e3d7bf45959a3aeceeb20074"} Dec 01 09:32:01 crc kubenswrapper[4867]: I1201 09:32:01.978217 4867 scope.go:117] "RemoveContainer" containerID="406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.023865 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.037979 4867 scope.go:117] "RemoveContainer" containerID="406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec" Dec 01 09:32:02 crc kubenswrapper[4867]: E1201 09:32:02.043374 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec\": container with ID starting with 406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec not found: ID does not exist" containerID="406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.043423 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec"} err="failed to get container status \"406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec\": 
rpc error: code = NotFound desc = could not find container \"406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec\": container with ID starting with 406e873ec09ab9d6606a78dbdbaf0c24aeca9d702366856d72f817972bbcfcec not found: ID does not exist" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.043953 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.057444 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:32:02 crc kubenswrapper[4867]: E1201 09:32:02.058045 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.058066 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.058325 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.059148 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.064633 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.064771 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.064863 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.088192 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.113330 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.113382 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.113444 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc 
kubenswrapper[4867]: I1201 09:32:02.113520 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.113591 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l6fm\" (UniqueName: \"kubernetes.io/projected/d178f07b-43d0-48ea-a5fe-898f68e80850-kube-api-access-8l6fm\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.215458 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.215506 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.215553 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 
09:32:02.215608 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.215659 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l6fm\" (UniqueName: \"kubernetes.io/projected/d178f07b-43d0-48ea-a5fe-898f68e80850-kube-api-access-8l6fm\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.220931 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.221027 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.221651 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.222615 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d178f07b-43d0-48ea-a5fe-898f68e80850-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.238580 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l6fm\" (UniqueName: \"kubernetes.io/projected/d178f07b-43d0-48ea-a5fe-898f68e80850-kube-api-access-8l6fm\") pod \"nova-cell1-novncproxy-0\" (UID: \"d178f07b-43d0-48ea-a5fe-898f68e80850\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.287644 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.288706 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.299893 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.399837 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.837339 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e" path="/var/lib/kubelet/pods/e07b9026-5ff9-4f0e-a1a0-bf4636c51c7e/volumes" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.888741 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:32:02 crc kubenswrapper[4867]: W1201 09:32:02.891753 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd178f07b_43d0_48ea_a5fe_898f68e80850.slice/crio-d7bfe72184ce18e006e619371a12c527670fe486c3e1523fdd7a7c0f703edd82 WatchSource:0}: Error finding container d7bfe72184ce18e006e619371a12c527670fe486c3e1523fdd7a7c0f703edd82: Status 404 returned error can't find the container with id d7bfe72184ce18e006e619371a12c527670fe486c3e1523fdd7a7c0f703edd82 Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.906155 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 01 09:32:02 crc kubenswrapper[4867]: I1201 09:32:02.994306 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d178f07b-43d0-48ea-a5fe-898f68e80850","Type":"ContainerStarted","Data":"d7bfe72184ce18e006e619371a12c527670fe486c3e1523fdd7a7c0f703edd82"} Dec 01 09:32:03 crc kubenswrapper[4867]: I1201 09:32:03.004302 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 09:32:04 crc kubenswrapper[4867]: I1201 09:32:04.023616 4867 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d178f07b-43d0-48ea-a5fe-898f68e80850","Type":"ContainerStarted","Data":"08ac54de3090bd93c2885a436eb7745002c073536857b1b068101b594bd0ab70"} Dec 01 09:32:04 crc kubenswrapper[4867]: I1201 09:32:04.045860 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.045778443 podStartE2EDuration="2.045778443s" podCreationTimestamp="2025-12-01 09:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:04.042327418 +0000 UTC m=+1445.501714172" watchObservedRunningTime="2025-12-01 09:32:04.045778443 +0000 UTC m=+1445.505165197" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.097252 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.100440 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.100730 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.100749 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.105278 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.107959 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.325343 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-r494r"] Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.327005 4867 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.348784 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-r494r"] Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.501662 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-config\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.502130 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.502181 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.502632 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6kg\" (UniqueName: \"kubernetes.io/projected/b5292749-115d-4d53-9ad5-0000a87fe88d-kube-api-access-ss6kg\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.502677 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.502713 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.604681 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.605031 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.605146 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6kg\" (UniqueName: \"kubernetes.io/projected/b5292749-115d-4d53-9ad5-0000a87fe88d-kube-api-access-ss6kg\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.605257 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.605368 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.605524 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-config\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.605660 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.606198 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.606232 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.606327 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.606472 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-config\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.631734 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6kg\" (UniqueName: \"kubernetes.io/projected/b5292749-115d-4d53-9ad5-0000a87fe88d-kube-api-access-ss6kg\") pod \"dnsmasq-dns-89c5cd4d5-r494r\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:05 crc kubenswrapper[4867]: I1201 09:32:05.660302 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:06 crc kubenswrapper[4867]: I1201 09:32:06.232493 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-r494r"] Dec 01 09:32:07 crc kubenswrapper[4867]: I1201 09:32:07.051717 4867 generic.go:334] "Generic (PLEG): container finished" podID="b5292749-115d-4d53-9ad5-0000a87fe88d" containerID="1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519" exitCode=0 Dec 01 09:32:07 crc kubenswrapper[4867]: I1201 09:32:07.051774 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" event={"ID":"b5292749-115d-4d53-9ad5-0000a87fe88d","Type":"ContainerDied","Data":"1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519"} Dec 01 09:32:07 crc kubenswrapper[4867]: I1201 09:32:07.052252 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" event={"ID":"b5292749-115d-4d53-9ad5-0000a87fe88d","Type":"ContainerStarted","Data":"68b2a6d8c734904ecc015201324dd223688b2031b774cbf458f66ce06b9c408c"} Dec 01 09:32:07 crc kubenswrapper[4867]: I1201 09:32:07.400369 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.064409 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" event={"ID":"b5292749-115d-4d53-9ad5-0000a87fe88d","Type":"ContainerStarted","Data":"2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af"} Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.064931 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.094547 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" podStartSLOduration=3.09452667 
podStartE2EDuration="3.09452667s" podCreationTimestamp="2025-12-01 09:32:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:08.083713072 +0000 UTC m=+1449.543099826" watchObservedRunningTime="2025-12-01 09:32:08.09452667 +0000 UTC m=+1449.553913424" Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.594216 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.594496 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-log" containerID="cri-o://556309ab756db3ad47a82037ddad0b526638f0aa86226cd781b2196ad0bfc152" gracePeriod=30 Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.594572 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-api" containerID="cri-o://4a74d24d68d5d840b8cb2237dbb0e3b14d9428896889eef774977cac0fd7e82c" gracePeriod=30 Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.946159 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.946851 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="ceilometer-central-agent" containerID="cri-o://84915af6eafda3e2a49022b05dff200a1e8c6a021bcbf78d3b23d28bf426c760" gracePeriod=30 Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.946936 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="proxy-httpd" 
containerID="cri-o://94ef95f022027bc8b0d1c2914b140376701eae047cbfc7d292d4d8d23ffc4c76" gracePeriod=30 Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.946978 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="sg-core" containerID="cri-o://bc4c053fa3322961c6a417677e4fa3e7640e231458021436ad7bfc1802013052" gracePeriod=30 Dec 01 09:32:08 crc kubenswrapper[4867]: I1201 09:32:08.946992 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="ceilometer-notification-agent" containerID="cri-o://0c5f1fce49b7d235eb2c663c45e8502c1355e005f213767a98e8fb6b6fb551dd" gracePeriod=30 Dec 01 09:32:09 crc kubenswrapper[4867]: I1201 09:32:09.078096 4867 generic.go:334] "Generic (PLEG): container finished" podID="10932a09-ea39-4e72-808d-d56122867d2a" containerID="bc4c053fa3322961c6a417677e4fa3e7640e231458021436ad7bfc1802013052" exitCode=2 Dec 01 09:32:09 crc kubenswrapper[4867]: I1201 09:32:09.078153 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerDied","Data":"bc4c053fa3322961c6a417677e4fa3e7640e231458021436ad7bfc1802013052"} Dec 01 09:32:09 crc kubenswrapper[4867]: I1201 09:32:09.080069 4867 generic.go:334] "Generic (PLEG): container finished" podID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerID="556309ab756db3ad47a82037ddad0b526638f0aa86226cd781b2196ad0bfc152" exitCode=143 Dec 01 09:32:09 crc kubenswrapper[4867]: I1201 09:32:09.080146 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e748b11-e821-47c1-850b-e9812ec6bafd","Type":"ContainerDied","Data":"556309ab756db3ad47a82037ddad0b526638f0aa86226cd781b2196ad0bfc152"} Dec 01 09:32:10 crc kubenswrapper[4867]: I1201 09:32:10.090794 4867 generic.go:334] "Generic 
(PLEG): container finished" podID="10932a09-ea39-4e72-808d-d56122867d2a" containerID="94ef95f022027bc8b0d1c2914b140376701eae047cbfc7d292d4d8d23ffc4c76" exitCode=0 Dec 01 09:32:10 crc kubenswrapper[4867]: I1201 09:32:10.090844 4867 generic.go:334] "Generic (PLEG): container finished" podID="10932a09-ea39-4e72-808d-d56122867d2a" containerID="84915af6eafda3e2a49022b05dff200a1e8c6a021bcbf78d3b23d28bf426c760" exitCode=0 Dec 01 09:32:10 crc kubenswrapper[4867]: I1201 09:32:10.090838 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerDied","Data":"94ef95f022027bc8b0d1c2914b140376701eae047cbfc7d292d4d8d23ffc4c76"} Dec 01 09:32:10 crc kubenswrapper[4867]: I1201 09:32:10.090877 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerDied","Data":"84915af6eafda3e2a49022b05dff200a1e8c6a021bcbf78d3b23d28bf426c760"} Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.102295 4867 generic.go:334] "Generic (PLEG): container finished" podID="10932a09-ea39-4e72-808d-d56122867d2a" containerID="0c5f1fce49b7d235eb2c663c45e8502c1355e005f213767a98e8fb6b6fb551dd" exitCode=0 Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.102369 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerDied","Data":"0c5f1fce49b7d235eb2c663c45e8502c1355e005f213767a98e8fb6b6fb551dd"} Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.102627 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10932a09-ea39-4e72-808d-d56122867d2a","Type":"ContainerDied","Data":"44d5ec80b0a241d81a5ff1cb3df800f3d931503944898827fb5e6bfeb490080f"} Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.102643 4867 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="44d5ec80b0a241d81a5ff1cb3df800f3d931503944898827fb5e6bfeb490080f" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.140451 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.324923 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnrdw\" (UniqueName: \"kubernetes.io/projected/10932a09-ea39-4e72-808d-d56122867d2a-kube-api-access-fnrdw\") pod \"10932a09-ea39-4e72-808d-d56122867d2a\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.325309 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-scripts\") pod \"10932a09-ea39-4e72-808d-d56122867d2a\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.325357 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-run-httpd\") pod \"10932a09-ea39-4e72-808d-d56122867d2a\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.325385 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-sg-core-conf-yaml\") pod \"10932a09-ea39-4e72-808d-d56122867d2a\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.325409 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-ceilometer-tls-certs\") pod \"10932a09-ea39-4e72-808d-d56122867d2a\" (UID: 
\"10932a09-ea39-4e72-808d-d56122867d2a\") " Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.325430 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-log-httpd\") pod \"10932a09-ea39-4e72-808d-d56122867d2a\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.325461 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-config-data\") pod \"10932a09-ea39-4e72-808d-d56122867d2a\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.325542 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-combined-ca-bundle\") pod \"10932a09-ea39-4e72-808d-d56122867d2a\" (UID: \"10932a09-ea39-4e72-808d-d56122867d2a\") " Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.327562 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "10932a09-ea39-4e72-808d-d56122867d2a" (UID: "10932a09-ea39-4e72-808d-d56122867d2a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.327742 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "10932a09-ea39-4e72-808d-d56122867d2a" (UID: "10932a09-ea39-4e72-808d-d56122867d2a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.331979 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-scripts" (OuterVolumeSpecName: "scripts") pod "10932a09-ea39-4e72-808d-d56122867d2a" (UID: "10932a09-ea39-4e72-808d-d56122867d2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.340137 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10932a09-ea39-4e72-808d-d56122867d2a-kube-api-access-fnrdw" (OuterVolumeSpecName: "kube-api-access-fnrdw") pod "10932a09-ea39-4e72-808d-d56122867d2a" (UID: "10932a09-ea39-4e72-808d-d56122867d2a"). InnerVolumeSpecName "kube-api-access-fnrdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.363148 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "10932a09-ea39-4e72-808d-d56122867d2a" (UID: "10932a09-ea39-4e72-808d-d56122867d2a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.385267 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "10932a09-ea39-4e72-808d-d56122867d2a" (UID: "10932a09-ea39-4e72-808d-d56122867d2a"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.427324 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10932a09-ea39-4e72-808d-d56122867d2a" (UID: "10932a09-ea39-4e72-808d-d56122867d2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.428915 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnrdw\" (UniqueName: \"kubernetes.io/projected/10932a09-ea39-4e72-808d-d56122867d2a-kube-api-access-fnrdw\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.428943 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.428956 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.428970 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.428982 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.428992 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/10932a09-ea39-4e72-808d-d56122867d2a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.429272 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.454656 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-config-data" (OuterVolumeSpecName: "config-data") pod "10932a09-ea39-4e72-808d-d56122867d2a" (UID: "10932a09-ea39-4e72-808d-d56122867d2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:11 crc kubenswrapper[4867]: I1201 09:32:11.531136 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10932a09-ea39-4e72-808d-d56122867d2a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.114513 4867 generic.go:334] "Generic (PLEG): container finished" podID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerID="4a74d24d68d5d840b8cb2237dbb0e3b14d9428896889eef774977cac0fd7e82c" exitCode=0 Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.114804 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.115519 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e748b11-e821-47c1-850b-e9812ec6bafd","Type":"ContainerDied","Data":"4a74d24d68d5d840b8cb2237dbb0e3b14d9428896889eef774977cac0fd7e82c"} Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.115571 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e748b11-e821-47c1-850b-e9812ec6bafd","Type":"ContainerDied","Data":"bb97567b879d5f2898501b769ae0af5f64a643b41f697a65e9d4360d5fa50f32"} Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.115584 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb97567b879d5f2898501b769ae0af5f64a643b41f697a65e9d4360d5fa50f32" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.171856 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.192790 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.205511 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.224672 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:32:12 crc kubenswrapper[4867]: E1201 09:32:12.225389 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="ceilometer-notification-agent" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.225506 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="ceilometer-notification-agent" Dec 01 09:32:12 crc kubenswrapper[4867]: E1201 09:32:12.225591 4867 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="sg-core" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.225662 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="sg-core" Dec 01 09:32:12 crc kubenswrapper[4867]: E1201 09:32:12.225745 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-log" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.225850 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-log" Dec 01 09:32:12 crc kubenswrapper[4867]: E1201 09:32:12.225953 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="ceilometer-central-agent" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.226028 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="ceilometer-central-agent" Dec 01 09:32:12 crc kubenswrapper[4867]: E1201 09:32:12.226119 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="proxy-httpd" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.226197 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="proxy-httpd" Dec 01 09:32:12 crc kubenswrapper[4867]: E1201 09:32:12.226290 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-api" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.226368 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-api" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.226723 4867 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-api" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.226836 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="sg-core" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.226913 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="proxy-httpd" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.226999 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" containerName="nova-api-log" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.227090 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="ceilometer-notification-agent" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.227187 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="10932a09-ea39-4e72-808d-d56122867d2a" containerName="ceilometer-central-agent" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.229505 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.232335 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.232564 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.236075 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.272197 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.344558 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e748b11-e821-47c1-850b-e9812ec6bafd-logs\") pod \"2e748b11-e821-47c1-850b-e9812ec6bafd\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.344734 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-config-data\") pod \"2e748b11-e821-47c1-850b-e9812ec6bafd\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.344777 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-combined-ca-bundle\") pod \"2e748b11-e821-47c1-850b-e9812ec6bafd\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.344944 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgmhp\" (UniqueName: 
\"kubernetes.io/projected/2e748b11-e821-47c1-850b-e9812ec6bafd-kube-api-access-bgmhp\") pod \"2e748b11-e821-47c1-850b-e9812ec6bafd\" (UID: \"2e748b11-e821-47c1-850b-e9812ec6bafd\") " Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345252 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345291 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-config-data\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345335 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e748b11-e821-47c1-850b-e9812ec6bafd-logs" (OuterVolumeSpecName: "logs") pod "2e748b11-e821-47c1-850b-e9812ec6bafd" (UID: "2e748b11-e821-47c1-850b-e9812ec6bafd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345393 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2699b818-66ce-4531-9084-e599305630ed-run-httpd\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345424 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-scripts\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345588 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2699b818-66ce-4531-9084-e599305630ed-log-httpd\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345646 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345685 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfslw\" (UniqueName: \"kubernetes.io/projected/2699b818-66ce-4531-9084-e599305630ed-kube-api-access-qfslw\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345735 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.345807 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e748b11-e821-47c1-850b-e9812ec6bafd-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.351193 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e748b11-e821-47c1-850b-e9812ec6bafd-kube-api-access-bgmhp" (OuterVolumeSpecName: "kube-api-access-bgmhp") pod "2e748b11-e821-47c1-850b-e9812ec6bafd" (UID: "2e748b11-e821-47c1-850b-e9812ec6bafd"). InnerVolumeSpecName "kube-api-access-bgmhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.401662 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e748b11-e821-47c1-850b-e9812ec6bafd" (UID: "2e748b11-e821-47c1-850b-e9812ec6bafd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.401682 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.408594 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-config-data" (OuterVolumeSpecName: "config-data") pod "2e748b11-e821-47c1-850b-e9812ec6bafd" (UID: "2e748b11-e821-47c1-850b-e9812ec6bafd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.440226 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448088 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2699b818-66ce-4531-9084-e599305630ed-run-httpd\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448131 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-scripts\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448224 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2699b818-66ce-4531-9084-e599305630ed-log-httpd\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448255 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448281 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfslw\" (UniqueName: \"kubernetes.io/projected/2699b818-66ce-4531-9084-e599305630ed-kube-api-access-qfslw\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") 
" pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448310 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448334 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448350 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-config-data\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448416 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448426 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e748b11-e821-47c1-850b-e9812ec6bafd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.448436 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgmhp\" (UniqueName: \"kubernetes.io/projected/2e748b11-e821-47c1-850b-e9812ec6bafd-kube-api-access-bgmhp\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.450552 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2699b818-66ce-4531-9084-e599305630ed-log-httpd\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.450886 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2699b818-66ce-4531-9084-e599305630ed-run-httpd\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.452289 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.453666 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-scripts\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.455569 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.462540 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-config-data\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0" Dec 01 
09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.466746 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2699b818-66ce-4531-9084-e599305630ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0"
Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.472873 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfslw\" (UniqueName: \"kubernetes.io/projected/2699b818-66ce-4531-9084-e599305630ed-kube-api-access-qfslw\") pod \"ceilometer-0\" (UID: \"2699b818-66ce-4531-9084-e599305630ed\") " pod="openstack/ceilometer-0"
Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.549332 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.843968 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10932a09-ea39-4e72-808d-d56122867d2a" path="/var/lib/kubelet/pods/10932a09-ea39-4e72-808d-d56122867d2a/volumes"
Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.907048 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c846795f4-k7mlj" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused"
Dec 01 09:32:12 crc kubenswrapper[4867]: I1201 09:32:12.907179 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c846795f4-k7mlj"
Dec 01 09:32:13 crc kubenswrapper[4867]: W1201 09:32:13.027071 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2699b818_66ce_4531_9084_e599305630ed.slice/crio-a0c156dc47016c3cc22b9bc516b6e35f448ab856c15f17bdf4422460c86b30ac WatchSource:0}: Error finding container a0c156dc47016c3cc22b9bc516b6e35f448ab856c15f17bdf4422460c86b30ac: Status 404 returned error can't find the container with id a0c156dc47016c3cc22b9bc516b6e35f448ab856c15f17bdf4422460c86b30ac
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.028087 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.133078 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.133284 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2699b818-66ce-4531-9084-e599305630ed","Type":"ContainerStarted","Data":"a0c156dc47016c3cc22b9bc516b6e35f448ab856c15f17bdf4422460c86b30ac"}
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.185129 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.207681 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.212992 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.265015 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.267049 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.272280 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.273416 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.273739 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.332044 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.389697 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-config-data\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.389793 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.389873 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac959b-554e-4193-b92b-606d335f19e9-logs\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.389930 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-public-tls-certs\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.390033 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.390125 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6c7\" (UniqueName: \"kubernetes.io/projected/34ac959b-554e-4193-b92b-606d335f19e9-kube-api-access-4d6c7\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.463749 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vkcwz"]
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.465412 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.467884 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.473123 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.487140 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vkcwz"]
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.491955 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.492037 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6c7\" (UniqueName: \"kubernetes.io/projected/34ac959b-554e-4193-b92b-606d335f19e9-kube-api-access-4d6c7\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.492083 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-config-data\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.492101 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.492138 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac959b-554e-4193-b92b-606d335f19e9-logs\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.492182 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-public-tls-certs\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.496027 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac959b-554e-4193-b92b-606d335f19e9-logs\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.499655 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-config-data\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.500500 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.512215 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.512689 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-public-tls-certs\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.518907 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6c7\" (UniqueName: \"kubernetes.io/projected/34ac959b-554e-4193-b92b-606d335f19e9-kube-api-access-4d6c7\") pod \"nova-api-0\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.593978 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9qhn\" (UniqueName: \"kubernetes.io/projected/2f49a1bf-d1bd-4027-879c-e5d8b6081396-kube-api-access-k9qhn\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.594077 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-config-data\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.594130 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-scripts\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.594210 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.607979 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.695553 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-config-data\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.695621 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-scripts\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.695687 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.695753 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9qhn\" (UniqueName: \"kubernetes.io/projected/2f49a1bf-d1bd-4027-879c-e5d8b6081396-kube-api-access-k9qhn\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.700308 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-config-data\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.701066 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.701881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-scripts\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.775846 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9qhn\" (UniqueName: \"kubernetes.io/projected/2f49a1bf-d1bd-4027-879c-e5d8b6081396-kube-api-access-k9qhn\") pod \"nova-cell1-cell-mapping-vkcwz\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:13 crc kubenswrapper[4867]: I1201 09:32:13.785842 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vkcwz"
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:14.225284 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:14.424047 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vkcwz"]
Dec 01 09:32:15 crc kubenswrapper[4867]: W1201 09:32:14.424683 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f49a1bf_d1bd_4027_879c_e5d8b6081396.slice/crio-2d49ff662f2fa5cbec53ae1b7ba86c82c017441d9ed3b2d5db66e403e83fdedc WatchSource:0}: Error finding container 2d49ff662f2fa5cbec53ae1b7ba86c82c017441d9ed3b2d5db66e403e83fdedc: Status 404 returned error can't find the container with id 2d49ff662f2fa5cbec53ae1b7ba86c82c017441d9ed3b2d5db66e403e83fdedc
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:14.848956 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e748b11-e821-47c1-850b-e9812ec6bafd" path="/var/lib/kubelet/pods/2e748b11-e821-47c1-850b-e9812ec6bafd/volumes"
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.164854 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"34ac959b-554e-4193-b92b-606d335f19e9","Type":"ContainerStarted","Data":"fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e"}
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.165139 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"34ac959b-554e-4193-b92b-606d335f19e9","Type":"ContainerStarted","Data":"aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72"}
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.165158 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"34ac959b-554e-4193-b92b-606d335f19e9","Type":"ContainerStarted","Data":"b4c127bd2b01ecb76a686cfe38b2ed6b749ff3b9b6facb6f998ab1ab3da71556"}
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.182577 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2699b818-66ce-4531-9084-e599305630ed","Type":"ContainerStarted","Data":"24b4cf4ac23ec176632ea3132b31e165f3bccba940b81bf06d4aa042997106f1"}
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.200087 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vkcwz" event={"ID":"2f49a1bf-d1bd-4027-879c-e5d8b6081396","Type":"ContainerStarted","Data":"81eac3897506718b344a440218a58d4ea149d62d48ae6d9a6b6ba5946dc88ee9"}
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.200404 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vkcwz" event={"ID":"2f49a1bf-d1bd-4027-879c-e5d8b6081396","Type":"ContainerStarted","Data":"2d49ff662f2fa5cbec53ae1b7ba86c82c017441d9ed3b2d5db66e403e83fdedc"}
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.213505 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.213479868 podStartE2EDuration="2.213479868s" podCreationTimestamp="2025-12-01 09:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:15.191243867 +0000 UTC m=+1456.650630641" watchObservedRunningTime="2025-12-01 09:32:15.213479868 +0000 UTC m=+1456.672866622"
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.240631 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vkcwz" podStartSLOduration=2.240610153 podStartE2EDuration="2.240610153s" podCreationTimestamp="2025-12-01 09:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:15.237250321 +0000 UTC m=+1456.696637075" watchObservedRunningTime="2025-12-01 09:32:15.240610153 +0000 UTC m=+1456.699996907"
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.663000 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r"
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.778665 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-slzgk"]
Dec 01 09:32:15 crc kubenswrapper[4867]: I1201 09:32:15.778924 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" podUID="ce614908-558e-45de-ac61-f095149fee19" containerName="dnsmasq-dns" containerID="cri-o://a0941972b92dae92cff4525420fc6089a42f6b0cceffb85f50f3a0f83eee7dab" gracePeriod=10
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.237093 4867 generic.go:334] "Generic (PLEG): container finished" podID="ce614908-558e-45de-ac61-f095149fee19" containerID="a0941972b92dae92cff4525420fc6089a42f6b0cceffb85f50f3a0f83eee7dab" exitCode=0
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.237276 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" event={"ID":"ce614908-558e-45de-ac61-f095149fee19","Type":"ContainerDied","Data":"a0941972b92dae92cff4525420fc6089a42f6b0cceffb85f50f3a0f83eee7dab"}
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.250827 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2699b818-66ce-4531-9084-e599305630ed","Type":"ContainerStarted","Data":"c4779978157e8d0f42acbd831d5af7d3cef2ee01ef9327549f9c4ac93ebce99f"}
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.426741 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-slzgk"
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.599725 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-swift-storage-0\") pod \"ce614908-558e-45de-ac61-f095149fee19\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") "
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.600169 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jc97\" (UniqueName: \"kubernetes.io/projected/ce614908-558e-45de-ac61-f095149fee19-kube-api-access-8jc97\") pod \"ce614908-558e-45de-ac61-f095149fee19\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") "
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.600194 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-sb\") pod \"ce614908-558e-45de-ac61-f095149fee19\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") "
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.600225 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-svc\") pod \"ce614908-558e-45de-ac61-f095149fee19\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") "
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.600305 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-nb\") pod \"ce614908-558e-45de-ac61-f095149fee19\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") "
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.600409 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-config\") pod \"ce614908-558e-45de-ac61-f095149fee19\" (UID: \"ce614908-558e-45de-ac61-f095149fee19\") "
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.605734 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce614908-558e-45de-ac61-f095149fee19-kube-api-access-8jc97" (OuterVolumeSpecName: "kube-api-access-8jc97") pod "ce614908-558e-45de-ac61-f095149fee19" (UID: "ce614908-558e-45de-ac61-f095149fee19"). InnerVolumeSpecName "kube-api-access-8jc97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.681160 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce614908-558e-45de-ac61-f095149fee19" (UID: "ce614908-558e-45de-ac61-f095149fee19"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.684320 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce614908-558e-45de-ac61-f095149fee19" (UID: "ce614908-558e-45de-ac61-f095149fee19"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.695954 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce614908-558e-45de-ac61-f095149fee19" (UID: "ce614908-558e-45de-ac61-f095149fee19"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.700457 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce614908-558e-45de-ac61-f095149fee19" (UID: "ce614908-558e-45de-ac61-f095149fee19"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.702172 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jc97\" (UniqueName: \"kubernetes.io/projected/ce614908-558e-45de-ac61-f095149fee19-kube-api-access-8jc97\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.702198 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.702208 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.702217 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.702226 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.711274 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-config" (OuterVolumeSpecName: "config") pod "ce614908-558e-45de-ac61-f095149fee19" (UID: "ce614908-558e-45de-ac61-f095149fee19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:16 crc kubenswrapper[4867]: I1201 09:32:16.803688 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce614908-558e-45de-ac61-f095149fee19-config\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:17 crc kubenswrapper[4867]: I1201 09:32:17.266587 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2699b818-66ce-4531-9084-e599305630ed","Type":"ContainerStarted","Data":"74449d986c3a3a47fe5f4723ef32fa9ada4ccbae35af0cb27fc6c01872ca2670"}
Dec 01 09:32:17 crc kubenswrapper[4867]: I1201 09:32:17.268625 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-slzgk" event={"ID":"ce614908-558e-45de-ac61-f095149fee19","Type":"ContainerDied","Data":"aa79e1566426062e2a1ff93e1297e6aba8bb4a72927c298aab4a570d6796b018"}
Dec 01 09:32:17 crc kubenswrapper[4867]: I1201 09:32:17.268672 4867 scope.go:117] "RemoveContainer" containerID="a0941972b92dae92cff4525420fc6089a42f6b0cceffb85f50f3a0f83eee7dab"
Dec 01 09:32:17 crc kubenswrapper[4867]: I1201 09:32:17.268804 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-slzgk"
Dec 01 09:32:17 crc kubenswrapper[4867]: I1201 09:32:17.292965 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-slzgk"]
Dec 01 09:32:17 crc kubenswrapper[4867]: I1201 09:32:17.298075 4867 scope.go:117] "RemoveContainer" containerID="6f450b6a19142aa05ee93a7e928d8d391e4ce7caa724c0bc3b7f09a7e6341c91"
Dec 01 09:32:17 crc kubenswrapper[4867]: I1201 09:32:17.305144 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-slzgk"]
Dec 01 09:32:18 crc kubenswrapper[4867]: I1201 09:32:18.845705 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce614908-558e-45de-ac61-f095149fee19" path="/var/lib/kubelet/pods/ce614908-558e-45de-ac61-f095149fee19/volumes"
Dec 01 09:32:19 crc kubenswrapper[4867]: I1201 09:32:19.310320 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2699b818-66ce-4531-9084-e599305630ed","Type":"ContainerStarted","Data":"97266ad2b5d91ec2986174c38836222910ec7fadf0ffcf36fc8eaaeb162049e9"}
Dec 01 09:32:19 crc kubenswrapper[4867]: I1201 09:32:19.310628 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 01 09:32:19 crc kubenswrapper[4867]: I1201 09:32:19.340167 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.017087933 podStartE2EDuration="7.340148937s" podCreationTimestamp="2025-12-01 09:32:12 +0000 UTC" firstStartedPulling="2025-12-01 09:32:13.030308139 +0000 UTC m=+1454.489694893" lastFinishedPulling="2025-12-01 09:32:18.353369143 +0000 UTC m=+1459.812755897" observedRunningTime="2025-12-01 09:32:19.337466373 +0000 UTC m=+1460.796853147" watchObservedRunningTime="2025-12-01 09:32:19.340148937 +0000 UTC m=+1460.799535701"
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.010107 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c846795f4-k7mlj"
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.197611 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpbwt\" (UniqueName: \"kubernetes.io/projected/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-kube-api-access-hpbwt\") pod \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") "
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.198002 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-tls-certs\") pod \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") "
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.198105 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-config-data\") pod \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") "
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.198185 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-scripts\") pod \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") "
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.198274 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-logs\") pod \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") "
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.198355 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-combined-ca-bundle\") pod \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") "
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.198490 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-secret-key\") pod \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\" (UID: \"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e\") "
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.199287 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-logs" (OuterVolumeSpecName: "logs") pod "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" (UID: "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.204498 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" (UID: "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.205104 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-kube-api-access-hpbwt" (OuterVolumeSpecName: "kube-api-access-hpbwt") pod "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" (UID: "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e"). InnerVolumeSpecName "kube-api-access-hpbwt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.228265 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-scripts" (OuterVolumeSpecName: "scripts") pod "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" (UID: "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.230619 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-config-data" (OuterVolumeSpecName: "config-data") pod "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" (UID: "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.243262 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" (UID: "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.260572 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" (UID: "e3ec81b7-2197-4dfb-8865-9414f0cdfc6e"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.300222 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.300368 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpbwt\" (UniqueName: \"kubernetes.io/projected/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-kube-api-access-hpbwt\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.300422 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.300496 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.300545 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.300592 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-logs\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.300638 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.338838 4867 generic.go:334]
"Generic (PLEG): container finished" podID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerID="7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f" exitCode=137 Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.338921 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerDied","Data":"7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f"} Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.338956 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c846795f4-k7mlj" event={"ID":"e3ec81b7-2197-4dfb-8865-9414f0cdfc6e","Type":"ContainerDied","Data":"d1ea359bdea40daddb2690ec637a7978a3695bd78dd5c8829fbfee213f06285c"} Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.338979 4867 scope.go:117] "RemoveContainer" containerID="f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4" Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.339147 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c846795f4-k7mlj" Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.344257 4867 generic.go:334] "Generic (PLEG): container finished" podID="2f49a1bf-d1bd-4027-879c-e5d8b6081396" containerID="81eac3897506718b344a440218a58d4ea149d62d48ae6d9a6b6ba5946dc88ee9" exitCode=0 Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.344296 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vkcwz" event={"ID":"2f49a1bf-d1bd-4027-879c-e5d8b6081396","Type":"ContainerDied","Data":"81eac3897506718b344a440218a58d4ea149d62d48ae6d9a6b6ba5946dc88ee9"} Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.393782 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c846795f4-k7mlj"] Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.401995 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c846795f4-k7mlj"] Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.539200 4867 scope.go:117] "RemoveContainer" containerID="7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f" Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.560790 4867 scope.go:117] "RemoveContainer" containerID="f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4" Dec 01 09:32:21 crc kubenswrapper[4867]: E1201 09:32:21.561612 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4\": container with ID starting with f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4 not found: ID does not exist" containerID="f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4" Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.561701 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4"} err="failed to get container status \"f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4\": rpc error: code = NotFound desc = could not find container \"f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4\": container with ID starting with f8e464b7eb2b25f34de89419d024ce3de918ed146549c279457ce68b63041cf4 not found: ID does not exist" Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.561732 4867 scope.go:117] "RemoveContainer" containerID="7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f" Dec 01 09:32:21 crc kubenswrapper[4867]: E1201 09:32:21.562086 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f\": container with ID starting with 7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f not found: ID does not exist" containerID="7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f" Dec 01 09:32:21 crc kubenswrapper[4867]: I1201 09:32:21.562117 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f"} err="failed to get container status \"7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f\": rpc error: code = NotFound desc = could not find container \"7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f\": container with ID starting with 7d407d773adde4e56125d8367a5647a70061e41451fee16defd9704d33938a4f not found: ID does not exist" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.736116 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vkcwz" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.827601 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-scripts\") pod \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.827645 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-config-data\") pod \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.827729 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-combined-ca-bundle\") pod \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.827842 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9qhn\" (UniqueName: \"kubernetes.io/projected/2f49a1bf-d1bd-4027-879c-e5d8b6081396-kube-api-access-k9qhn\") pod \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\" (UID: \"2f49a1bf-d1bd-4027-879c-e5d8b6081396\") " Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.848865 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f49a1bf-d1bd-4027-879c-e5d8b6081396-kube-api-access-k9qhn" (OuterVolumeSpecName: "kube-api-access-k9qhn") pod "2f49a1bf-d1bd-4027-879c-e5d8b6081396" (UID: "2f49a1bf-d1bd-4027-879c-e5d8b6081396"). InnerVolumeSpecName "kube-api-access-k9qhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.849298 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-scripts" (OuterVolumeSpecName: "scripts") pod "2f49a1bf-d1bd-4027-879c-e5d8b6081396" (UID: "2f49a1bf-d1bd-4027-879c-e5d8b6081396"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.877925 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" path="/var/lib/kubelet/pods/e3ec81b7-2197-4dfb-8865-9414f0cdfc6e/volumes" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.878882 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-config-data" (OuterVolumeSpecName: "config-data") pod "2f49a1bf-d1bd-4027-879c-e5d8b6081396" (UID: "2f49a1bf-d1bd-4027-879c-e5d8b6081396"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.880685 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f49a1bf-d1bd-4027-879c-e5d8b6081396" (UID: "2f49a1bf-d1bd-4027-879c-e5d8b6081396"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.930912 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.930943 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9qhn\" (UniqueName: \"kubernetes.io/projected/2f49a1bf-d1bd-4027-879c-e5d8b6081396-kube-api-access-k9qhn\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.930956 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:22 crc kubenswrapper[4867]: I1201 09:32:22.930965 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f49a1bf-d1bd-4027-879c-e5d8b6081396-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.365785 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vkcwz" event={"ID":"2f49a1bf-d1bd-4027-879c-e5d8b6081396","Type":"ContainerDied","Data":"2d49ff662f2fa5cbec53ae1b7ba86c82c017441d9ed3b2d5db66e403e83fdedc"} Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.366156 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d49ff662f2fa5cbec53ae1b7ba86c82c017441d9ed3b2d5db66e403e83fdedc" Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.366070 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vkcwz" Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.559979 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.560449 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="34ac959b-554e-4193-b92b-606d335f19e9" containerName="nova-api-log" containerID="cri-o://aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72" gracePeriod=30 Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.560948 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="34ac959b-554e-4193-b92b-606d335f19e9" containerName="nova-api-api" containerID="cri-o://fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e" gracePeriod=30 Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.575834 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.576099 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3c1d2a06-41e8-483e-8fe6-ce90496010bc" containerName="nova-scheduler-scheduler" containerID="cri-o://f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990" gracePeriod=30 Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.640568 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.640868 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-log" containerID="cri-o://7da8d55b80bee370279a8cb0cab58eac6870e3802a895d79a05978f6112fb54d" gracePeriod=30 Dec 01 09:32:23 crc kubenswrapper[4867]: I1201 09:32:23.641066 4867 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-metadata" containerID="cri-o://6a3e8514ac7be518d49721c25fd01ea6ac55ff925d070872e28f3899e8490d7d" gracePeriod=30 Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.228194 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.357750 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-combined-ca-bundle\") pod \"34ac959b-554e-4193-b92b-606d335f19e9\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.357893 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d6c7\" (UniqueName: \"kubernetes.io/projected/34ac959b-554e-4193-b92b-606d335f19e9-kube-api-access-4d6c7\") pod \"34ac959b-554e-4193-b92b-606d335f19e9\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.358005 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-public-tls-certs\") pod \"34ac959b-554e-4193-b92b-606d335f19e9\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.358077 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-internal-tls-certs\") pod \"34ac959b-554e-4193-b92b-606d335f19e9\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.358120 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac959b-554e-4193-b92b-606d335f19e9-logs\") pod \"34ac959b-554e-4193-b92b-606d335f19e9\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.358160 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-config-data\") pod \"34ac959b-554e-4193-b92b-606d335f19e9\" (UID: \"34ac959b-554e-4193-b92b-606d335f19e9\") " Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.358576 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34ac959b-554e-4193-b92b-606d335f19e9-logs" (OuterVolumeSpecName: "logs") pod "34ac959b-554e-4193-b92b-606d335f19e9" (UID: "34ac959b-554e-4193-b92b-606d335f19e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.369793 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ac959b-554e-4193-b92b-606d335f19e9-kube-api-access-4d6c7" (OuterVolumeSpecName: "kube-api-access-4d6c7") pod "34ac959b-554e-4193-b92b-606d335f19e9" (UID: "34ac959b-554e-4193-b92b-606d335f19e9"). InnerVolumeSpecName "kube-api-access-4d6c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.388967 4867 generic.go:334] "Generic (PLEG): container finished" podID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerID="7da8d55b80bee370279a8cb0cab58eac6870e3802a895d79a05978f6112fb54d" exitCode=143 Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.389046 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80eeedcc-8486-4395-a708-2fb4b89945d4","Type":"ContainerDied","Data":"7da8d55b80bee370279a8cb0cab58eac6870e3802a895d79a05978f6112fb54d"} Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.392127 4867 generic.go:334] "Generic (PLEG): container finished" podID="34ac959b-554e-4193-b92b-606d335f19e9" containerID="fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e" exitCode=0 Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.392150 4867 generic.go:334] "Generic (PLEG): container finished" podID="34ac959b-554e-4193-b92b-606d335f19e9" containerID="aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72" exitCode=143 Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.392181 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"34ac959b-554e-4193-b92b-606d335f19e9","Type":"ContainerDied","Data":"fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e"} Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.392199 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"34ac959b-554e-4193-b92b-606d335f19e9","Type":"ContainerDied","Data":"aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72"} Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.392212 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"34ac959b-554e-4193-b92b-606d335f19e9","Type":"ContainerDied","Data":"b4c127bd2b01ecb76a686cfe38b2ed6b749ff3b9b6facb6f998ab1ab3da71556"} Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.392231 4867 scope.go:117] "RemoveContainer" containerID="fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.392465 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.400679 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34ac959b-554e-4193-b92b-606d335f19e9" (UID: "34ac959b-554e-4193-b92b-606d335f19e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.407725 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-config-data" (OuterVolumeSpecName: "config-data") pod "34ac959b-554e-4193-b92b-606d335f19e9" (UID: "34ac959b-554e-4193-b92b-606d335f19e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.425550 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "34ac959b-554e-4193-b92b-606d335f19e9" (UID: "34ac959b-554e-4193-b92b-606d335f19e9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.439996 4867 scope.go:117] "RemoveContainer" containerID="aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.444495 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "34ac959b-554e-4193-b92b-606d335f19e9" (UID: "34ac959b-554e-4193-b92b-606d335f19e9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.462717 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.465761 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.465802 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d6c7\" (UniqueName: \"kubernetes.io/projected/34ac959b-554e-4193-b92b-606d335f19e9-kube-api-access-4d6c7\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.465829 4867 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.465846 4867 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/34ac959b-554e-4193-b92b-606d335f19e9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.465859 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac959b-554e-4193-b92b-606d335f19e9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.475680 4867 scope.go:117] "RemoveContainer" containerID="fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e" Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.476436 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e\": container with ID starting with fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e not found: ID does not exist" containerID="fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.476555 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e"} err="failed to get container status \"fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e\": rpc error: code = NotFound desc = could not find container \"fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e\": container with ID starting with fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e not found: ID does not exist" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.476651 4867 scope.go:117] "RemoveContainer" containerID="aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72" Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.482715 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72\": container with ID starting with aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72 not found: ID does not exist" containerID="aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.482856 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72"} err="failed to get container status \"aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72\": rpc error: code = NotFound desc = could not find container \"aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72\": container with ID starting with aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72 not found: ID does not exist" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.482970 4867 scope.go:117] "RemoveContainer" containerID="fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.486022 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e"} err="failed to get container status \"fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e\": rpc error: code = NotFound desc = could not find container \"fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e\": container with ID starting with fe7a3c3176525480b61ca9edbd489dfd5f63cc84f5e78765ce8a9a018d3f9f4e not found: ID does not exist" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.486069 4867 scope.go:117] "RemoveContainer" containerID="aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.486497 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72"} err="failed to get container status \"aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72\": rpc error: code = NotFound desc = could not find container \"aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72\": container with ID starting with aa434dc4f32e273be0a9b0e9292d4f8081db24c650020bf23cd0326c59595c72 not found: ID does not exist" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.742200 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.754037 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.766518 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.766901 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ac959b-554e-4193-b92b-606d335f19e9" containerName="nova-api-api" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.766917 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ac959b-554e-4193-b92b-606d335f19e9" containerName="nova-api-api" Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.766926 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.766931 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.766946 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f49a1bf-d1bd-4027-879c-e5d8b6081396" containerName="nova-manage" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.766952 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2f49a1bf-d1bd-4027-879c-e5d8b6081396" containerName="nova-manage" Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.766974 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce614908-558e-45de-ac61-f095149fee19" containerName="init" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.766979 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce614908-558e-45de-ac61-f095149fee19" containerName="init" Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.767012 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767018 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.767033 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce614908-558e-45de-ac61-f095149fee19" containerName="dnsmasq-dns" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767039 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce614908-558e-45de-ac61-f095149fee19" containerName="dnsmasq-dns" Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.767048 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767054 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" Dec 01 09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.767061 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon-log" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767066 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon-log" Dec 01 
09:32:24 crc kubenswrapper[4867]: E1201 09:32:24.767076 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ac959b-554e-4193-b92b-606d335f19e9" containerName="nova-api-log" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767081 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ac959b-554e-4193-b92b-606d335f19e9" containerName="nova-api-log" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767245 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f49a1bf-d1bd-4027-879c-e5d8b6081396" containerName="nova-manage" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767258 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767269 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ac959b-554e-4193-b92b-606d335f19e9" containerName="nova-api-log" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767280 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ac959b-554e-4193-b92b-606d335f19e9" containerName="nova-api-api" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767287 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce614908-558e-45de-ac61-f095149fee19" containerName="dnsmasq-dns" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767294 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon-log" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767300 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.767310 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ec81b7-2197-4dfb-8865-9414f0cdfc6e" containerName="horizon" Dec 01 09:32:24 crc 
kubenswrapper[4867]: I1201 09:32:24.768575 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.771310 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.772501 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.772709 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.792577 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.845491 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ac959b-554e-4193-b92b-606d335f19e9" path="/var/lib/kubelet/pods/34ac959b-554e-4193-b92b-606d335f19e9/volumes" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.873635 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7gft\" (UniqueName: \"kubernetes.io/projected/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-kube-api-access-s7gft\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.874326 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.874803 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-config-data\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.874840 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.874893 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-logs\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.874930 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.976920 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-config-data\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.976980 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc 
kubenswrapper[4867]: I1201 09:32:24.977012 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-logs\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.977048 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.977112 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gft\" (UniqueName: \"kubernetes.io/projected/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-kube-api-access-s7gft\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.977201 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.977955 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-logs\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.984132 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-config-data\") pod \"nova-api-0\" (UID: 
\"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.990604 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.994659 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:24 crc kubenswrapper[4867]: I1201 09:32:24.998928 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gft\" (UniqueName: \"kubernetes.io/projected/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-kube-api-access-s7gft\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:25 crc kubenswrapper[4867]: I1201 09:32:25.003880 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c7ce91-10c4-45b8-ba1c-db503ab7d5a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7\") " pod="openstack/nova-api-0" Dec 01 09:32:25 crc kubenswrapper[4867]: I1201 09:32:25.091416 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:32:25 crc kubenswrapper[4867]: I1201 09:32:25.571727 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:32:25 crc kubenswrapper[4867]: W1201 09:32:25.572711 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29c7ce91_10c4_45b8_ba1c_db503ab7d5a7.slice/crio-57e60c213f9f106525e1ca4865cda28c6f37b51a1ff0ff3b35a84fcd7d8e2635 WatchSource:0}: Error finding container 57e60c213f9f106525e1ca4865cda28c6f37b51a1ff0ff3b35a84fcd7d8e2635: Status 404 returned error can't find the container with id 57e60c213f9f106525e1ca4865cda28c6f37b51a1ff0ff3b35a84fcd7d8e2635 Dec 01 09:32:26 crc kubenswrapper[4867]: E1201 09:32:26.163369 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:32:26 crc kubenswrapper[4867]: E1201 09:32:26.165317 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:32:26 crc kubenswrapper[4867]: E1201 09:32:26.166853 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:32:26 crc kubenswrapper[4867]: E1201 09:32:26.166936 4867 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3c1d2a06-41e8-483e-8fe6-ce90496010bc" containerName="nova-scheduler-scheduler" Dec 01 09:32:26 crc kubenswrapper[4867]: I1201 09:32:26.412449 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7","Type":"ContainerStarted","Data":"3aa3635dbb05c86f3e04a46802e67012491a312b0659a33737a8e573d4d6e2a1"} Dec 01 09:32:26 crc kubenswrapper[4867]: I1201 09:32:26.412646 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7","Type":"ContainerStarted","Data":"969cbfff5c8d432df5e80bdc55e6bd14ab4f634193c6e3f2518fbd553b605b32"} Dec 01 09:32:26 crc kubenswrapper[4867]: I1201 09:32:26.412737 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29c7ce91-10c4-45b8-ba1c-db503ab7d5a7","Type":"ContainerStarted","Data":"57e60c213f9f106525e1ca4865cda28c6f37b51a1ff0ff3b35a84fcd7d8e2635"} Dec 01 09:32:26 crc kubenswrapper[4867]: I1201 09:32:26.445211 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.445188061 podStartE2EDuration="2.445188061s" podCreationTimestamp="2025-12-01 09:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:26.438457986 +0000 UTC m=+1467.897844730" watchObservedRunningTime="2025-12-01 09:32:26.445188061 +0000 UTC m=+1467.904574815" Dec 01 09:32:26 crc kubenswrapper[4867]: I1201 09:32:26.963032 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.122660 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-combined-ca-bundle\") pod \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.123064 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtslm\" (UniqueName: \"kubernetes.io/projected/3c1d2a06-41e8-483e-8fe6-ce90496010bc-kube-api-access-dtslm\") pod \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.123264 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-config-data\") pod \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\" (UID: \"3c1d2a06-41e8-483e-8fe6-ce90496010bc\") " Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.128409 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1d2a06-41e8-483e-8fe6-ce90496010bc-kube-api-access-dtslm" (OuterVolumeSpecName: "kube-api-access-dtslm") pod "3c1d2a06-41e8-483e-8fe6-ce90496010bc" (UID: "3c1d2a06-41e8-483e-8fe6-ce90496010bc"). InnerVolumeSpecName "kube-api-access-dtslm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.156365 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c1d2a06-41e8-483e-8fe6-ce90496010bc" (UID: "3c1d2a06-41e8-483e-8fe6-ce90496010bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.163558 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-config-data" (OuterVolumeSpecName: "config-data") pod "3c1d2a06-41e8-483e-8fe6-ce90496010bc" (UID: "3c1d2a06-41e8-483e-8fe6-ce90496010bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.225251 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.225288 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtslm\" (UniqueName: \"kubernetes.io/projected/3c1d2a06-41e8-483e-8fe6-ce90496010bc-kube-api-access-dtslm\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.225301 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c1d2a06-41e8-483e-8fe6-ce90496010bc-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.280932 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": dial tcp 10.217.0.194:8775: connect: connection refused" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.281194 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": dial tcp 10.217.0.194:8775: connect: connection 
refused" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.441159 4867 generic.go:334] "Generic (PLEG): container finished" podID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerID="6a3e8514ac7be518d49721c25fd01ea6ac55ff925d070872e28f3899e8490d7d" exitCode=0 Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.441235 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80eeedcc-8486-4395-a708-2fb4b89945d4","Type":"ContainerDied","Data":"6a3e8514ac7be518d49721c25fd01ea6ac55ff925d070872e28f3899e8490d7d"} Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.447038 4867 generic.go:334] "Generic (PLEG): container finished" podID="3c1d2a06-41e8-483e-8fe6-ce90496010bc" containerID="f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990" exitCode=0 Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.447141 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.447862 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c1d2a06-41e8-483e-8fe6-ce90496010bc","Type":"ContainerDied","Data":"f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990"} Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.447898 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c1d2a06-41e8-483e-8fe6-ce90496010bc","Type":"ContainerDied","Data":"59472ee870adda222e3fc892490bc4056f5a1bb46ab2b923e47ffe8740082444"} Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.447920 4867 scope.go:117] "RemoveContainer" containerID="f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.475503 4867 scope.go:117] "RemoveContainer" containerID="f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990" Dec 01 09:32:27 crc kubenswrapper[4867]: 
E1201 09:32:27.476340 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990\": container with ID starting with f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990 not found: ID does not exist" containerID="f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.476383 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990"} err="failed to get container status \"f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990\": rpc error: code = NotFound desc = could not find container \"f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990\": container with ID starting with f9ebf71a00eb771b9a691bcf7a0545833fbf43c7e8e0cb2fd5613bb48f3ed990 not found: ID does not exist" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.511698 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.535755 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.550739 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:32:27 crc kubenswrapper[4867]: E1201 09:32:27.551830 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1d2a06-41e8-483e-8fe6-ce90496010bc" containerName="nova-scheduler-scheduler" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.551864 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1d2a06-41e8-483e-8fe6-ce90496010bc" containerName="nova-scheduler-scheduler" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.552637 4867 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="3c1d2a06-41e8-483e-8fe6-ce90496010bc" containerName="nova-scheduler-scheduler" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.555230 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.561354 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.569246 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.636693 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cflxh\" (UniqueName: \"kubernetes.io/projected/d0ac7269-f887-4d9f-a582-6726a5be70f7-kube-api-access-cflxh\") pod \"nova-scheduler-0\" (UID: \"d0ac7269-f887-4d9f-a582-6726a5be70f7\") " pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.636769 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0ac7269-f887-4d9f-a582-6726a5be70f7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0ac7269-f887-4d9f-a582-6726a5be70f7\") " pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.636821 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0ac7269-f887-4d9f-a582-6726a5be70f7-config-data\") pod \"nova-scheduler-0\" (UID: \"d0ac7269-f887-4d9f-a582-6726a5be70f7\") " pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.637008 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.737681 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-combined-ca-bundle\") pod \"80eeedcc-8486-4395-a708-2fb4b89945d4\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.738272 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6rxw\" (UniqueName: \"kubernetes.io/projected/80eeedcc-8486-4395-a708-2fb4b89945d4-kube-api-access-q6rxw\") pod \"80eeedcc-8486-4395-a708-2fb4b89945d4\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.738431 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-nova-metadata-tls-certs\") pod \"80eeedcc-8486-4395-a708-2fb4b89945d4\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.738520 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-config-data\") pod \"80eeedcc-8486-4395-a708-2fb4b89945d4\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.738545 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80eeedcc-8486-4395-a708-2fb4b89945d4-logs\") pod \"80eeedcc-8486-4395-a708-2fb4b89945d4\" (UID: \"80eeedcc-8486-4395-a708-2fb4b89945d4\") " Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.738979 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cflxh\" (UniqueName: \"kubernetes.io/projected/d0ac7269-f887-4d9f-a582-6726a5be70f7-kube-api-access-cflxh\") pod \"nova-scheduler-0\" (UID: \"d0ac7269-f887-4d9f-a582-6726a5be70f7\") " pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.739042 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0ac7269-f887-4d9f-a582-6726a5be70f7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0ac7269-f887-4d9f-a582-6726a5be70f7\") " pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.739070 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0ac7269-f887-4d9f-a582-6726a5be70f7-config-data\") pod \"nova-scheduler-0\" (UID: \"d0ac7269-f887-4d9f-a582-6726a5be70f7\") " pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.742334 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80eeedcc-8486-4395-a708-2fb4b89945d4-logs" (OuterVolumeSpecName: "logs") pod "80eeedcc-8486-4395-a708-2fb4b89945d4" (UID: "80eeedcc-8486-4395-a708-2fb4b89945d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.744558 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80eeedcc-8486-4395-a708-2fb4b89945d4-kube-api-access-q6rxw" (OuterVolumeSpecName: "kube-api-access-q6rxw") pod "80eeedcc-8486-4395-a708-2fb4b89945d4" (UID: "80eeedcc-8486-4395-a708-2fb4b89945d4"). InnerVolumeSpecName "kube-api-access-q6rxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.745798 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0ac7269-f887-4d9f-a582-6726a5be70f7-config-data\") pod \"nova-scheduler-0\" (UID: \"d0ac7269-f887-4d9f-a582-6726a5be70f7\") " pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.748699 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0ac7269-f887-4d9f-a582-6726a5be70f7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0ac7269-f887-4d9f-a582-6726a5be70f7\") " pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.765799 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cflxh\" (UniqueName: \"kubernetes.io/projected/d0ac7269-f887-4d9f-a582-6726a5be70f7-kube-api-access-cflxh\") pod \"nova-scheduler-0\" (UID: \"d0ac7269-f887-4d9f-a582-6726a5be70f7\") " pod="openstack/nova-scheduler-0" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.773475 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-config-data" (OuterVolumeSpecName: "config-data") pod "80eeedcc-8486-4395-a708-2fb4b89945d4" (UID: "80eeedcc-8486-4395-a708-2fb4b89945d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.786210 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80eeedcc-8486-4395-a708-2fb4b89945d4" (UID: "80eeedcc-8486-4395-a708-2fb4b89945d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.797277 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "80eeedcc-8486-4395-a708-2fb4b89945d4" (UID: "80eeedcc-8486-4395-a708-2fb4b89945d4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.839916 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.840228 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80eeedcc-8486-4395-a708-2fb4b89945d4-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.840240 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.840253 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6rxw\" (UniqueName: \"kubernetes.io/projected/80eeedcc-8486-4395-a708-2fb4b89945d4-kube-api-access-q6rxw\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.840264 4867 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80eeedcc-8486-4395-a708-2fb4b89945d4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:27 crc kubenswrapper[4867]: I1201 09:32:27.935356 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.409978 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:32:28 crc kubenswrapper[4867]: W1201 09:32:28.414621 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0ac7269_f887_4d9f_a582_6726a5be70f7.slice/crio-1cafa0b30412015f1d2459912dd267731b16511462e0690664fdb2dc01798394 WatchSource:0}: Error finding container 1cafa0b30412015f1d2459912dd267731b16511462e0690664fdb2dc01798394: Status 404 returned error can't find the container with id 1cafa0b30412015f1d2459912dd267731b16511462e0690664fdb2dc01798394 Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.457109 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0ac7269-f887-4d9f-a582-6726a5be70f7","Type":"ContainerStarted","Data":"1cafa0b30412015f1d2459912dd267731b16511462e0690664fdb2dc01798394"} Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.459205 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80eeedcc-8486-4395-a708-2fb4b89945d4","Type":"ContainerDied","Data":"3dde498faff44d72cae817655d563c7bc28869a7d7522cc84119ae382c0b827b"} Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.459219 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.459244 4867 scope.go:117] "RemoveContainer" containerID="6a3e8514ac7be518d49721c25fd01ea6ac55ff925d070872e28f3899e8490d7d" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.536228 4867 scope.go:117] "RemoveContainer" containerID="7da8d55b80bee370279a8cb0cab58eac6870e3802a895d79a05978f6112fb54d" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.575954 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.596967 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.614436 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:32:28 crc kubenswrapper[4867]: E1201 09:32:28.614821 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-log" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.614839 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-log" Dec 01 09:32:28 crc kubenswrapper[4867]: E1201 09:32:28.614934 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-metadata" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.614942 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-metadata" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.615138 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-log" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.615160 4867 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" containerName="nova-metadata-metadata" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.616092 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.619913 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.620474 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.623956 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.761734 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-logs\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.761885 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.761936 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pzxm\" (UniqueName: \"kubernetes.io/projected/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-kube-api-access-8pzxm\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.761958 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-config-data\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.762089 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.838941 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c1d2a06-41e8-483e-8fe6-ce90496010bc" path="/var/lib/kubelet/pods/3c1d2a06-41e8-483e-8fe6-ce90496010bc/volumes" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.840153 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80eeedcc-8486-4395-a708-2fb4b89945d4" path="/var/lib/kubelet/pods/80eeedcc-8486-4395-a708-2fb4b89945d4/volumes" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.862926 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.862988 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-logs\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.863052 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.863072 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pzxm\" (UniqueName: \"kubernetes.io/projected/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-kube-api-access-8pzxm\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.863090 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-config-data\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.864388 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-logs\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.867319 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.867593 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-config-data\") pod \"nova-metadata-0\" (UID: 
\"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.867941 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.886409 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pzxm\" (UniqueName: \"kubernetes.io/projected/2f8e90f5-24d7-406e-a2aa-d44b9e6bac71-kube-api-access-8pzxm\") pod \"nova-metadata-0\" (UID: \"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71\") " pod="openstack/nova-metadata-0" Dec 01 09:32:28 crc kubenswrapper[4867]: I1201 09:32:28.941554 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:32:29 crc kubenswrapper[4867]: I1201 09:32:29.378380 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:32:29 crc kubenswrapper[4867]: W1201 09:32:29.379980 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f8e90f5_24d7_406e_a2aa_d44b9e6bac71.slice/crio-95a75a573d310482cf62473b916024b8e20a3f4230ed8b3db7708d5c35f4413d WatchSource:0}: Error finding container 95a75a573d310482cf62473b916024b8e20a3f4230ed8b3db7708d5c35f4413d: Status 404 returned error can't find the container with id 95a75a573d310482cf62473b916024b8e20a3f4230ed8b3db7708d5c35f4413d Dec 01 09:32:29 crc kubenswrapper[4867]: I1201 09:32:29.472578 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71","Type":"ContainerStarted","Data":"95a75a573d310482cf62473b916024b8e20a3f4230ed8b3db7708d5c35f4413d"} Dec 01 09:32:29 crc 
kubenswrapper[4867]: I1201 09:32:29.474647 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0ac7269-f887-4d9f-a582-6726a5be70f7","Type":"ContainerStarted","Data":"036a889f2801585d7dd0fa5a52cb07bd27fb9490b5a55c83de50c87d7e6c4789"} Dec 01 09:32:29 crc kubenswrapper[4867]: I1201 09:32:29.495444 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.495422753 podStartE2EDuration="2.495422753s" podCreationTimestamp="2025-12-01 09:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:29.492246345 +0000 UTC m=+1470.951633099" watchObservedRunningTime="2025-12-01 09:32:29.495422753 +0000 UTC m=+1470.954809507" Dec 01 09:32:30 crc kubenswrapper[4867]: I1201 09:32:30.496391 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71","Type":"ContainerStarted","Data":"2f0b48f5166d3791050bb17143bc91a4a52e9c544ca376d2d14fc039e3f329bb"} Dec 01 09:32:30 crc kubenswrapper[4867]: I1201 09:32:30.496680 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f8e90f5-24d7-406e-a2aa-d44b9e6bac71","Type":"ContainerStarted","Data":"d1f3dcb2ac0be79f1f2f921305ea47c87e1db96328ce6fa4fbf514483a963d98"} Dec 01 09:32:32 crc kubenswrapper[4867]: I1201 09:32:32.936235 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 09:32:33 crc kubenswrapper[4867]: I1201 09:32:33.941648 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:32:33 crc kubenswrapper[4867]: I1201 09:32:33.941968 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:32:35 crc kubenswrapper[4867]: I1201 
09:32:35.092402 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:32:35 crc kubenswrapper[4867]: I1201 09:32:35.092460 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:32:36 crc kubenswrapper[4867]: I1201 09:32:36.105191 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29c7ce91-10c4-45b8-ba1c-db503ab7d5a7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:32:36 crc kubenswrapper[4867]: I1201 09:32:36.105226 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29c7ce91-10c4-45b8-ba1c-db503ab7d5a7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:32:37 crc kubenswrapper[4867]: I1201 09:32:37.936501 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 09:32:37 crc kubenswrapper[4867]: I1201 09:32:37.970624 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 09:32:37 crc kubenswrapper[4867]: I1201 09:32:37.991167 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=9.991149782 podStartE2EDuration="9.991149782s" podCreationTimestamp="2025-12-01 09:32:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:30.525645539 +0000 UTC m=+1471.985032303" watchObservedRunningTime="2025-12-01 09:32:37.991149782 +0000 UTC m=+1479.450536536" Dec 01 09:32:38 crc kubenswrapper[4867]: I1201 09:32:38.599543 4867 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 09:32:38 crc kubenswrapper[4867]: I1201 09:32:38.942478 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 09:32:38 crc kubenswrapper[4867]: I1201 09:32:38.942514 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 09:32:39 crc kubenswrapper[4867]: I1201 09:32:39.954966 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f8e90f5-24d7-406e-a2aa-d44b9e6bac71" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:32:39 crc kubenswrapper[4867]: I1201 09:32:39.955268 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f8e90f5-24d7-406e-a2aa-d44b9e6bac71" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:32:42 crc kubenswrapper[4867]: I1201 09:32:42.564968 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 09:32:45 crc kubenswrapper[4867]: I1201 09:32:45.101309 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 09:32:45 crc kubenswrapper[4867]: I1201 09:32:45.102674 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 09:32:45 crc kubenswrapper[4867]: I1201 09:32:45.103232 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 09:32:45 crc kubenswrapper[4867]: I1201 09:32:45.103281 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 
09:32:45 crc kubenswrapper[4867]: I1201 09:32:45.112084 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 09:32:45 crc kubenswrapper[4867]: I1201 09:32:45.113988 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 09:32:48 crc kubenswrapper[4867]: I1201 09:32:48.950313 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 09:32:48 crc kubenswrapper[4867]: I1201 09:32:48.953124 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 09:32:48 crc kubenswrapper[4867]: I1201 09:32:48.959704 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 09:32:49 crc kubenswrapper[4867]: I1201 09:32:49.675419 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 09:32:51 crc kubenswrapper[4867]: I1201 09:32:51.601615 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:32:51 crc kubenswrapper[4867]: I1201 09:32:51.602002 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:32:58 crc kubenswrapper[4867]: I1201 09:32:58.872588 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:33:00 crc kubenswrapper[4867]: I1201 09:33:00.037398 4867 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:33:04 crc kubenswrapper[4867]: I1201 09:33:04.887698 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="63bff526-5063-4326-8b3c-0c580320be58" containerName="rabbitmq" containerID="cri-o://16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5" gracePeriod=604794 Dec 01 09:33:05 crc kubenswrapper[4867]: I1201 09:33:05.157128 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" containerName="rabbitmq" containerID="cri-o://5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008" gracePeriod=604795 Dec 01 09:33:07 crc kubenswrapper[4867]: I1201 09:33:07.720532 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="63bff526-5063-4326-8b3c-0c580320be58" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 01 09:33:08 crc kubenswrapper[4867]: I1201 09:33:08.131930 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.774569 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.834482 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.909840 4867 generic.go:334] "Generic (PLEG): container finished" podID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" containerID="5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008" exitCode=0 Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.909904 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7f260d89-a8a0-4d49-a34a-a36a06ef2eee","Type":"ContainerDied","Data":"5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008"} Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.909931 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7f260d89-a8a0-4d49-a34a-a36a06ef2eee","Type":"ContainerDied","Data":"5599e1d50b5891175e084c006f1a4c0fe84add0d38a187320464286d0ae49fb0"} Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.909946 4867 scope.go:117] "RemoveContainer" containerID="5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.910060 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.916735 4867 generic.go:334] "Generic (PLEG): container finished" podID="63bff526-5063-4326-8b3c-0c580320be58" containerID="16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5" exitCode=0 Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.916772 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63bff526-5063-4326-8b3c-0c580320be58","Type":"ContainerDied","Data":"16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5"} Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.916795 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63bff526-5063-4326-8b3c-0c580320be58","Type":"ContainerDied","Data":"d071e4ff6b3e798f26750823e7e4e403d7551d813defb7d6a22c2dc336d1d4da"} Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.916882 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933399 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-pod-info\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933475 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-tls\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933518 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-erlang-cookie\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933554 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vvxs\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-kube-api-access-7vvxs\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933578 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-confd\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933603 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-config-data\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933637 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933723 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-plugins\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933750 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-server-conf\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933765 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-tls\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933790 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-erlang-cookie\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 
09:33:11.933829 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-plugins\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933866 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-plugins-conf\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933889 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-plugins-conf\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933911 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63bff526-5063-4326-8b3c-0c580320be58-pod-info\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933928 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933959 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2kwf\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-kube-api-access-h2kwf\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" 
(UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.933991 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-erlang-cookie-secret\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.934011 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-confd\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.934042 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63bff526-5063-4326-8b3c-0c580320be58-erlang-cookie-secret\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.934059 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-config-data\") pod \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\" (UID: \"7f260d89-a8a0-4d49-a34a-a36a06ef2eee\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.934092 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-server-conf\") pod \"63bff526-5063-4326-8b3c-0c580320be58\" (UID: \"63bff526-5063-4326-8b3c-0c580320be58\") " Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.967377 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.977010 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.979947 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.980422 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.980447 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). 
InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.996779 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.996979 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-pod-info" (OuterVolumeSpecName: "pod-info") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.997293 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-kube-api-access-h2kwf" (OuterVolumeSpecName: "kube-api-access-h2kwf") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "kube-api-access-h2kwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.999117 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63bff526-5063-4326-8b3c-0c580320be58-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:11 crc kubenswrapper[4867]: I1201 09:33:11.999796 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.008225 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/63bff526-5063-4326-8b3c-0c580320be58-pod-info" (OuterVolumeSpecName: "pod-info") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.031012 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-kube-api-access-7vvxs" (OuterVolumeSpecName: "kube-api-access-7vvxs") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "kube-api-access-7vvxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.031217 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.034923 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036504 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036526 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vvxs\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-kube-api-access-7vvxs\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036569 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036581 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036595 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036603 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036632 4867 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036644 4867 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036653 4867 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63bff526-5063-4326-8b3c-0c580320be58-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036650 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036667 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036719 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2kwf\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-kube-api-access-h2kwf\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036730 4867 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036739 4867 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63bff526-5063-4326-8b3c-0c580320be58-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.036750 4867 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.044520 4867 scope.go:117] "RemoveContainer" containerID="b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.045275 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.081387 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-config-data" (OuterVolumeSpecName: "config-data") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.085915 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.100722 4867 scope.go:117] "RemoveContainer" containerID="5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008" Dec 01 09:33:12 crc kubenswrapper[4867]: E1201 09:33:12.105003 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008\": container with ID starting with 5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008 not found: ID does not exist" containerID="5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.105063 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008"} err="failed to get container status \"5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008\": rpc error: code = NotFound desc = could not find container \"5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008\": container with ID starting with 5571874b5306501064f0c8bfadaba2a8c63eee6455e2deb24d2e4b2a403ca008 not found: ID does not exist" Dec 01 09:33:12 crc kubenswrapper[4867]: 
I1201 09:33:12.105093 4867 scope.go:117] "RemoveContainer" containerID="b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc" Dec 01 09:33:12 crc kubenswrapper[4867]: E1201 09:33:12.108254 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc\": container with ID starting with b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc not found: ID does not exist" containerID="b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.108293 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc"} err="failed to get container status \"b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc\": rpc error: code = NotFound desc = could not find container \"b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc\": container with ID starting with b266880f24fcd0138fc752fed1bbccea596621df22b09f8f05090abd3f8acacc not found: ID does not exist" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.108321 4867 scope.go:117] "RemoveContainer" containerID="16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.108857 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.119649 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-server-conf" (OuterVolumeSpecName: "server-conf") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.138893 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.138920 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.138929 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.138940 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.138948 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.138956 4867 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63bff526-5063-4326-8b3c-0c580320be58-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.141167 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-config-data" (OuterVolumeSpecName: "config-data") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.159137 4867 scope.go:117] "RemoveContainer" containerID="bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.160474 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-server-conf" (OuterVolumeSpecName: "server-conf") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.195951 4867 scope.go:117] "RemoveContainer" containerID="16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5" Dec 01 09:33:12 crc kubenswrapper[4867]: E1201 09:33:12.197231 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5\": container with ID starting with 16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5 not found: ID does not exist" containerID="16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.197279 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5"} err="failed to get container status \"16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5\": rpc error: code = NotFound desc = could not find container \"16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5\": container with ID starting with 16c98f7f1a0f460a30b2b6c3277e0b19bae4407f075f2fdee48433b2196b74b5 not found: ID does not exist" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.197310 4867 scope.go:117] 
"RemoveContainer" containerID="bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f" Dec 01 09:33:12 crc kubenswrapper[4867]: E1201 09:33:12.197643 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f\": container with ID starting with bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f not found: ID does not exist" containerID="bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.197672 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f"} err="failed to get container status \"bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f\": rpc error: code = NotFound desc = could not find container \"bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f\": container with ID starting with bc1fcb2e09f9f9eb3b3ed5dd075ca24185a23aff2a86d37d58acb4a60948ed1f not found: ID does not exist" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.210884 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7f260d89-a8a0-4d49-a34a-a36a06ef2eee" (UID: "7f260d89-a8a0-4d49-a34a-a36a06ef2eee"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.240672 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.240698 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.240709 4867 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7f260d89-a8a0-4d49-a34a-a36a06ef2eee-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.260039 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "63bff526-5063-4326-8b3c-0c580320be58" (UID: "63bff526-5063-4326-8b3c-0c580320be58"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.342456 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63bff526-5063-4326-8b3c-0c580320be58-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.589881 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.612894 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.637874 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.650013 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.664873 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:33:12 crc kubenswrapper[4867]: E1201 09:33:12.665574 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" containerName="setup-container" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.665603 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" containerName="setup-container" Dec 01 09:33:12 crc kubenswrapper[4867]: E1201 09:33:12.665621 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" containerName="rabbitmq" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.665630 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" containerName="rabbitmq" Dec 01 09:33:12 crc kubenswrapper[4867]: E1201 09:33:12.665651 4867 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="63bff526-5063-4326-8b3c-0c580320be58" containerName="setup-container" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.665661 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bff526-5063-4326-8b3c-0c580320be58" containerName="setup-container" Dec 01 09:33:12 crc kubenswrapper[4867]: E1201 09:33:12.665678 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bff526-5063-4326-8b3c-0c580320be58" containerName="rabbitmq" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.665685 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bff526-5063-4326-8b3c-0c580320be58" containerName="rabbitmq" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.665945 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" containerName="rabbitmq" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.665971 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="63bff526-5063-4326-8b3c-0c580320be58" containerName="rabbitmq" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.667125 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.687164 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.688777 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.690781 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vjhz7" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.691057 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.691103 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.691265 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.691385 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.691496 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.691533 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.694188 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.694347 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.694524 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.694631 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.694755 4867 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s6htk" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.702217 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.702620 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.719752 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.750209 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751047 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751120 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj5hg\" (UniqueName: \"kubernetes.io/projected/e3936faf-3dae-4db5-8851-10c1ebe7673b-kube-api-access-pj5hg\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751144 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 
09:33:12.751183 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3936faf-3dae-4db5-8851-10c1ebe7673b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751211 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3936faf-3dae-4db5-8851-10c1ebe7673b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751404 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751446 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3936faf-3dae-4db5-8851-10c1ebe7673b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751486 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3936faf-3dae-4db5-8851-10c1ebe7673b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751555 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3936faf-3dae-4db5-8851-10c1ebe7673b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751582 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.751630 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.850107 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63bff526-5063-4326-8b3c-0c580320be58" path="/var/lib/kubelet/pods/63bff526-5063-4326-8b3c-0c580320be58/volumes" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.851221 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f260d89-a8a0-4d49-a34a-a36a06ef2eee" path="/var/lib/kubelet/pods/7f260d89-a8a0-4d49-a34a-a36a06ef2eee/volumes" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853540 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc 
kubenswrapper[4867]: I1201 09:33:12.853584 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853611 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a327b42-8b19-491b-a9ba-2c11f0227183-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853633 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853661 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853676 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj5hg\" (UniqueName: \"kubernetes.io/projected/e3936faf-3dae-4db5-8851-10c1ebe7673b-kube-api-access-pj5hg\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853702 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3936faf-3dae-4db5-8851-10c1ebe7673b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853719 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3936faf-3dae-4db5-8851-10c1ebe7673b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853747 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853779 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853795 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3936faf-3dae-4db5-8851-10c1ebe7673b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853829 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/1a327b42-8b19-491b-a9ba-2c11f0227183-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853848 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a327b42-8b19-491b-a9ba-2c11f0227183-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853870 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3936faf-3dae-4db5-8851-10c1ebe7673b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853898 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3936faf-3dae-4db5-8851-10c1ebe7673b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853915 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853937 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853964 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a327b42-8b19-491b-a9ba-2c11f0227183-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.853992 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.854010 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.854026 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a327b42-8b19-491b-a9ba-2c11f0227183-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.854056 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkkwl\" (UniqueName: \"kubernetes.io/projected/1a327b42-8b19-491b-a9ba-2c11f0227183-kube-api-access-zkkwl\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.854441 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.855181 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3936faf-3dae-4db5-8851-10c1ebe7673b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.856239 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3936faf-3dae-4db5-8851-10c1ebe7673b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.856398 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3936faf-3dae-4db5-8851-10c1ebe7673b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.856735 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.857026 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.861230 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3936faf-3dae-4db5-8851-10c1ebe7673b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.870109 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3936faf-3dae-4db5-8851-10c1ebe7673b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.874875 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.875120 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3936faf-3dae-4db5-8851-10c1ebe7673b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.875401 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj5hg\" (UniqueName: \"kubernetes.io/projected/e3936faf-3dae-4db5-8851-10c1ebe7673b-kube-api-access-pj5hg\") pod \"rabbitmq-server-0\" (UID: 
\"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.925428 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e3936faf-3dae-4db5-8851-10c1ebe7673b\") " pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.955607 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.955683 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a327b42-8b19-491b-a9ba-2c11f0227183-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.955700 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a327b42-8b19-491b-a9ba-2c11f0227183-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.955790 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a327b42-8b19-491b-a9ba-2c11f0227183-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.955909 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.955931 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.955948 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a327b42-8b19-491b-a9ba-2c11f0227183-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.955998 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkkwl\" (UniqueName: \"kubernetes.io/projected/1a327b42-8b19-491b-a9ba-2c11f0227183-kube-api-access-zkkwl\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.956034 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.956084 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/1a327b42-8b19-491b-a9ba-2c11f0227183-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.956100 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.956925 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.958284 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.958649 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a327b42-8b19-491b-a9ba-2c11f0227183-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.959646 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.960849 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a327b42-8b19-491b-a9ba-2c11f0227183-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.961588 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a327b42-8b19-491b-a9ba-2c11f0227183-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.963369 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a327b42-8b19-491b-a9ba-2c11f0227183-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.968103 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.989664 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.991399 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a327b42-8b19-491b-a9ba-2c11f0227183-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.991554 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a327b42-8b19-491b-a9ba-2c11f0227183-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:12 crc kubenswrapper[4867]: I1201 09:33:12.994876 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkkwl\" (UniqueName: \"kubernetes.io/projected/1a327b42-8b19-491b-a9ba-2c11f0227183-kube-api-access-zkkwl\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:13 crc kubenswrapper[4867]: I1201 09:33:13.020160 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1a327b42-8b19-491b-a9ba-2c11f0227183\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:13 crc kubenswrapper[4867]: I1201 09:33:13.313702 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:13 crc kubenswrapper[4867]: I1201 09:33:13.514230 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:33:13 crc kubenswrapper[4867]: I1201 09:33:13.846826 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:33:13 crc kubenswrapper[4867]: W1201 09:33:13.850129 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a327b42_8b19_491b_a9ba_2c11f0227183.slice/crio-279617f4779d0d5ab8ecdc9e337a7970a44b4984482654e64efee07cd1c6e48f WatchSource:0}: Error finding container 279617f4779d0d5ab8ecdc9e337a7970a44b4984482654e64efee07cd1c6e48f: Status 404 returned error can't find the container with id 279617f4779d0d5ab8ecdc9e337a7970a44b4984482654e64efee07cd1c6e48f Dec 01 09:33:13 crc kubenswrapper[4867]: I1201 09:33:13.941897 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1a327b42-8b19-491b-a9ba-2c11f0227183","Type":"ContainerStarted","Data":"279617f4779d0d5ab8ecdc9e337a7970a44b4984482654e64efee07cd1c6e48f"} Dec 01 09:33:13 crc kubenswrapper[4867]: I1201 09:33:13.945293 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3936faf-3dae-4db5-8851-10c1ebe7673b","Type":"ContainerStarted","Data":"00f21ae855a624ed4fa0ff832b162001828769a6ce72e9c03c50905957f1a6d8"} Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.435848 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-mnwlk"] Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.440111 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.444828 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.459365 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-mnwlk"] Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.497153 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.497252 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-config\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.497312 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.497342 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.497368 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.497403 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h99g2\" (UniqueName: \"kubernetes.io/projected/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-kube-api-access-h99g2\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.497448 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.598867 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.598934 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: 
\"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.598981 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.599028 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h99g2\" (UniqueName: \"kubernetes.io/projected/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-kube-api-access-h99g2\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.599079 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.599145 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.599214 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-config\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.599997 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.600424 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.600455 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.600701 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-config\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.600723 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 
09:33:14.601191 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.616357 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-mnwlk"] Dec 01 09:33:14 crc kubenswrapper[4867]: E1201 09:33:14.617327 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-h99g2], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" podUID="cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.628077 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h99g2\" (UniqueName: \"kubernetes.io/projected/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-kube-api-access-h99g2\") pod \"dnsmasq-dns-79bd4cc8c9-mnwlk\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.683525 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-gzkbb"] Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.685466 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.699874 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-gzkbb"] Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.802577 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.802636 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwj9x\" (UniqueName: \"kubernetes.io/projected/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-kube-api-access-pwj9x\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.802687 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.802742 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.802770 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.802840 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-config\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.802929 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-svc\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.905109 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-svc\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.905224 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.905255 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pwj9x\" (UniqueName: \"kubernetes.io/projected/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-kube-api-access-pwj9x\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.905295 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.905338 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.905360 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.905396 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-config\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.906105 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-svc\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.906525 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.906756 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.907439 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-config\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.907478 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.907679 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.933686 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwj9x\" (UniqueName: \"kubernetes.io/projected/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-kube-api-access-pwj9x\") pod \"dnsmasq-dns-6cd9bffc9-gzkbb\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.953545 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:14 crc kubenswrapper[4867]: I1201 09:33:14.973273 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.006489 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-swift-storage-0\") pod \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.006624 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-config\") pod \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.006645 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-sb\") pod \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 
09:33:15.007029 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66" (UID: "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.007092 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-config" (OuterVolumeSpecName: "config") pod "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66" (UID: "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.007164 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66" (UID: "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.007205 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-nb\") pod \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.007231 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-svc\") pod \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.007477 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66" (UID: "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.007534 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66" (UID: "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.007673 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h99g2\" (UniqueName: \"kubernetes.io/projected/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-kube-api-access-h99g2\") pod \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.008182 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-openstack-edpm-ipam\") pod \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\" (UID: \"cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66\") " Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.008584 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66" (UID: "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.009042 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.009061 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.009070 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.009082 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.009091 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.009099 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.016720 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.102560 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-kube-api-access-h99g2" (OuterVolumeSpecName: "kube-api-access-h99g2") pod "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66" (UID: "cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66"). InnerVolumeSpecName "kube-api-access-h99g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.112752 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h99g2\" (UniqueName: \"kubernetes.io/projected/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66-kube-api-access-h99g2\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.713084 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-gzkbb"] Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.969478 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3936faf-3dae-4db5-8851-10c1ebe7673b","Type":"ContainerStarted","Data":"f41f637e6f4aaa5bb9255d11dc3437e8c4eb2582446a818929e6dfb2b1c94b26"} Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.972002 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1a327b42-8b19-491b-a9ba-2c11f0227183","Type":"ContainerStarted","Data":"db7e591fa757c1945aa3a211892b8653e895690373a3b9609f4ffe1d6237c62d"} Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.975313 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" event={"ID":"bfcc3de8-d914-4608-966c-3d9a2dc2b11d","Type":"ContainerStarted","Data":"899a0d56deaec30168a612a673fe5aa8863465fa607bae4e2fe820e5cc6e3b71"} Dec 01 09:33:15 crc kubenswrapper[4867]: I1201 09:33:15.975337 4867 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-mnwlk" Dec 01 09:33:16 crc kubenswrapper[4867]: I1201 09:33:16.104508 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-mnwlk"] Dec 01 09:33:16 crc kubenswrapper[4867]: I1201 09:33:16.120498 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-mnwlk"] Dec 01 09:33:16 crc kubenswrapper[4867]: I1201 09:33:16.842397 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66" path="/var/lib/kubelet/pods/cc87a97a-43b2-4fcb-9ba7-f0ba8c949c66/volumes" Dec 01 09:33:16 crc kubenswrapper[4867]: I1201 09:33:16.985929 4867 generic.go:334] "Generic (PLEG): container finished" podID="bfcc3de8-d914-4608-966c-3d9a2dc2b11d" containerID="99d1347bae0896eaf05c7869b589ab0dde6fedf80b16b270269796bbd356ccc2" exitCode=0 Dec 01 09:33:16 crc kubenswrapper[4867]: I1201 09:33:16.986081 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" event={"ID":"bfcc3de8-d914-4608-966c-3d9a2dc2b11d","Type":"ContainerDied","Data":"99d1347bae0896eaf05c7869b589ab0dde6fedf80b16b270269796bbd356ccc2"} Dec 01 09:33:17 crc kubenswrapper[4867]: I1201 09:33:17.997472 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" event={"ID":"bfcc3de8-d914-4608-966c-3d9a2dc2b11d","Type":"ContainerStarted","Data":"7597be5af2a6f56ca06194d50db5918d8357bc6cf4ef0dcefa040558e98d198d"} Dec 01 09:33:17 crc kubenswrapper[4867]: I1201 09:33:17.998147 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:18 crc kubenswrapper[4867]: I1201 09:33:18.030992 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" podStartSLOduration=4.030971318 podStartE2EDuration="4.030971318s" 
podCreationTimestamp="2025-12-01 09:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:18.029280582 +0000 UTC m=+1519.488667346" watchObservedRunningTime="2025-12-01 09:33:18.030971318 +0000 UTC m=+1519.490358072" Dec 01 09:33:21 crc kubenswrapper[4867]: I1201 09:33:21.601058 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:33:21 crc kubenswrapper[4867]: I1201 09:33:21.601358 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.019050 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.113960 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-r494r"] Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.114883 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" podUID="b5292749-115d-4d53-9ad5-0000a87fe88d" containerName="dnsmasq-dns" containerID="cri-o://2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af" gracePeriod=10 Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.308864 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-94nfw"] Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.317022 4867 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.345487 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-94nfw"] Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.356849 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.357091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.357208 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.357310 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.357435 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.357527 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhf2j\" (UniqueName: \"kubernetes.io/projected/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-kube-api-access-zhf2j\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.361526 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-config\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.463827 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhf2j\" (UniqueName: \"kubernetes.io/projected/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-kube-api-access-zhf2j\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.463917 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-config\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.464020 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.464068 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.464143 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.464201 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.464270 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.465771 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.465975 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.466100 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-config\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.466655 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.466763 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.467002 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.509651 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhf2j\" (UniqueName: \"kubernetes.io/projected/cfe6379b-a971-4c4b-9cba-75f2f56de0b1-kube-api-access-zhf2j\") pod \"dnsmasq-dns-54ffdb7d8c-94nfw\" (UID: \"cfe6379b-a971-4c4b-9cba-75f2f56de0b1\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.693401 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.701290 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.771334 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss6kg\" (UniqueName: \"kubernetes.io/projected/b5292749-115d-4d53-9ad5-0000a87fe88d-kube-api-access-ss6kg\") pod \"b5292749-115d-4d53-9ad5-0000a87fe88d\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.771407 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-svc\") pod \"b5292749-115d-4d53-9ad5-0000a87fe88d\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.771544 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-swift-storage-0\") pod 
\"b5292749-115d-4d53-9ad5-0000a87fe88d\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.771658 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-sb\") pod \"b5292749-115d-4d53-9ad5-0000a87fe88d\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.771715 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-config\") pod \"b5292749-115d-4d53-9ad5-0000a87fe88d\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.771769 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-nb\") pod \"b5292749-115d-4d53-9ad5-0000a87fe88d\" (UID: \"b5292749-115d-4d53-9ad5-0000a87fe88d\") " Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.780791 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5292749-115d-4d53-9ad5-0000a87fe88d-kube-api-access-ss6kg" (OuterVolumeSpecName: "kube-api-access-ss6kg") pod "b5292749-115d-4d53-9ad5-0000a87fe88d" (UID: "b5292749-115d-4d53-9ad5-0000a87fe88d"). InnerVolumeSpecName "kube-api-access-ss6kg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.874497 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss6kg\" (UniqueName: \"kubernetes.io/projected/b5292749-115d-4d53-9ad5-0000a87fe88d-kube-api-access-ss6kg\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.884001 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5292749-115d-4d53-9ad5-0000a87fe88d" (UID: "b5292749-115d-4d53-9ad5-0000a87fe88d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.900130 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5292749-115d-4d53-9ad5-0000a87fe88d" (UID: "b5292749-115d-4d53-9ad5-0000a87fe88d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.901008 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5292749-115d-4d53-9ad5-0000a87fe88d" (UID: "b5292749-115d-4d53-9ad5-0000a87fe88d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.907835 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5292749-115d-4d53-9ad5-0000a87fe88d" (UID: "b5292749-115d-4d53-9ad5-0000a87fe88d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.914378 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-config" (OuterVolumeSpecName: "config") pod "b5292749-115d-4d53-9ad5-0000a87fe88d" (UID: "b5292749-115d-4d53-9ad5-0000a87fe88d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.980520 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.980611 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.980629 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 09:33:25.980678 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:25 crc kubenswrapper[4867]: I1201 
09:33:25.980690 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5292749-115d-4d53-9ad5-0000a87fe88d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.079328 4867 generic.go:334] "Generic (PLEG): container finished" podID="b5292749-115d-4d53-9ad5-0000a87fe88d" containerID="2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af" exitCode=0 Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.079407 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.079412 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" event={"ID":"b5292749-115d-4d53-9ad5-0000a87fe88d","Type":"ContainerDied","Data":"2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af"} Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.079488 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" event={"ID":"b5292749-115d-4d53-9ad5-0000a87fe88d","Type":"ContainerDied","Data":"68b2a6d8c734904ecc015201324dd223688b2031b774cbf458f66ce06b9c408c"} Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.079507 4867 scope.go:117] "RemoveContainer" containerID="2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af" Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.114748 4867 scope.go:117] "RemoveContainer" containerID="1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519" Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.120873 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-r494r"] Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.155103 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-r494r"] Dec 01 09:33:26 crc 
kubenswrapper[4867]: I1201 09:33:26.162661 4867 scope.go:117] "RemoveContainer" containerID="2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af" Dec 01 09:33:26 crc kubenswrapper[4867]: E1201 09:33:26.163134 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af\": container with ID starting with 2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af not found: ID does not exist" containerID="2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af" Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.163160 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af"} err="failed to get container status \"2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af\": rpc error: code = NotFound desc = could not find container \"2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af\": container with ID starting with 2b607714a8af1eda235fb550bae5694a67bd3fab958d15c3fdbfd68906d2a5af not found: ID does not exist" Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.163179 4867 scope.go:117] "RemoveContainer" containerID="1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519" Dec 01 09:33:26 crc kubenswrapper[4867]: E1201 09:33:26.163581 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519\": container with ID starting with 1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519 not found: ID does not exist" containerID="1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519" Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.163604 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519"} err="failed to get container status \"1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519\": rpc error: code = NotFound desc = could not find container \"1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519\": container with ID starting with 1d8f37f0f4cd6291a02aedb3f982cbda3af39f0ea24ec90d1422d2a9c67e4519 not found: ID does not exist" Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.237182 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-94nfw"] Dec 01 09:33:26 crc kubenswrapper[4867]: I1201 09:33:26.854298 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5292749-115d-4d53-9ad5-0000a87fe88d" path="/var/lib/kubelet/pods/b5292749-115d-4d53-9ad5-0000a87fe88d/volumes" Dec 01 09:33:27 crc kubenswrapper[4867]: I1201 09:33:27.090924 4867 generic.go:334] "Generic (PLEG): container finished" podID="cfe6379b-a971-4c4b-9cba-75f2f56de0b1" containerID="59df73707b5594f185262403d721a7b0cdbfad84ca699ec963a485c6e2c48b36" exitCode=0 Dec 01 09:33:27 crc kubenswrapper[4867]: I1201 09:33:27.090964 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" event={"ID":"cfe6379b-a971-4c4b-9cba-75f2f56de0b1","Type":"ContainerDied","Data":"59df73707b5594f185262403d721a7b0cdbfad84ca699ec963a485c6e2c48b36"} Dec 01 09:33:27 crc kubenswrapper[4867]: I1201 09:33:27.090992 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" event={"ID":"cfe6379b-a971-4c4b-9cba-75f2f56de0b1","Type":"ContainerStarted","Data":"01ad2e6442142623ef5c4ae7fd3c3619bce0fd24cde45349359b86dcf69b6fb6"} Dec 01 09:33:28 crc kubenswrapper[4867]: I1201 09:33:28.101183 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" 
event={"ID":"cfe6379b-a971-4c4b-9cba-75f2f56de0b1","Type":"ContainerStarted","Data":"9d937277ef64503569992248d7029fcd2c89c9e9175422092286aeb42827905c"} Dec 01 09:33:28 crc kubenswrapper[4867]: I1201 09:33:28.101534 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:28 crc kubenswrapper[4867]: I1201 09:33:28.120359 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" podStartSLOduration=3.120336133 podStartE2EDuration="3.120336133s" podCreationTimestamp="2025-12-01 09:33:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:28.116910699 +0000 UTC m=+1529.576297453" watchObservedRunningTime="2025-12-01 09:33:28.120336133 +0000 UTC m=+1529.579722887" Dec 01 09:33:30 crc kubenswrapper[4867]: I1201 09:33:30.662382 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-89c5cd4d5-r494r" podUID="b5292749-115d-4d53-9ad5-0000a87fe88d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.200:5353: i/o timeout" Dec 01 09:33:35 crc kubenswrapper[4867]: I1201 09:33:35.694998 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54ffdb7d8c-94nfw" Dec 01 09:33:35 crc kubenswrapper[4867]: I1201 09:33:35.752197 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-gzkbb"] Dec 01 09:33:35 crc kubenswrapper[4867]: I1201 09:33:35.752652 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" podUID="bfcc3de8-d914-4608-966c-3d9a2dc2b11d" containerName="dnsmasq-dns" containerID="cri-o://7597be5af2a6f56ca06194d50db5918d8357bc6cf4ef0dcefa040558e98d198d" gracePeriod=10 Dec 01 09:33:36 crc kubenswrapper[4867]: I1201 09:33:36.171549 4867 generic.go:334] 
"Generic (PLEG): container finished" podID="bfcc3de8-d914-4608-966c-3d9a2dc2b11d" containerID="7597be5af2a6f56ca06194d50db5918d8357bc6cf4ef0dcefa040558e98d198d" exitCode=0 Dec 01 09:33:36 crc kubenswrapper[4867]: I1201 09:33:36.171598 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" event={"ID":"bfcc3de8-d914-4608-966c-3d9a2dc2b11d","Type":"ContainerDied","Data":"7597be5af2a6f56ca06194d50db5918d8357bc6cf4ef0dcefa040558e98d198d"} Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.001372 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.107068 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwj9x\" (UniqueName: \"kubernetes.io/projected/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-kube-api-access-pwj9x\") pod \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.107154 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-svc\") pod \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.107200 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-sb\") pod \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.107276 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-nb\") pod 
\"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.107327 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-openstack-edpm-ipam\") pod \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.107358 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-config\") pod \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.107434 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-swift-storage-0\") pod \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\" (UID: \"bfcc3de8-d914-4608-966c-3d9a2dc2b11d\") " Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.113424 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-kube-api-access-pwj9x" (OuterVolumeSpecName: "kube-api-access-pwj9x") pod "bfcc3de8-d914-4608-966c-3d9a2dc2b11d" (UID: "bfcc3de8-d914-4608-966c-3d9a2dc2b11d"). InnerVolumeSpecName "kube-api-access-pwj9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.153612 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bfcc3de8-d914-4608-966c-3d9a2dc2b11d" (UID: "bfcc3de8-d914-4608-966c-3d9a2dc2b11d"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.155252 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "bfcc3de8-d914-4608-966c-3d9a2dc2b11d" (UID: "bfcc3de8-d914-4608-966c-3d9a2dc2b11d"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.155339 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bfcc3de8-d914-4608-966c-3d9a2dc2b11d" (UID: "bfcc3de8-d914-4608-966c-3d9a2dc2b11d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.182944 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" event={"ID":"bfcc3de8-d914-4608-966c-3d9a2dc2b11d","Type":"ContainerDied","Data":"899a0d56deaec30168a612a673fe5aa8863465fa607bae4e2fe820e5cc6e3b71"} Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.182998 4867 scope.go:117] "RemoveContainer" containerID="7597be5af2a6f56ca06194d50db5918d8357bc6cf4ef0dcefa040558e98d198d" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.183026 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd9bffc9-gzkbb" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.196273 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-config" (OuterVolumeSpecName: "config") pod "bfcc3de8-d914-4608-966c-3d9a2dc2b11d" (UID: "bfcc3de8-d914-4608-966c-3d9a2dc2b11d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.198862 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bfcc3de8-d914-4608-966c-3d9a2dc2b11d" (UID: "bfcc3de8-d914-4608-966c-3d9a2dc2b11d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.209788 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.209835 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwj9x\" (UniqueName: \"kubernetes.io/projected/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-kube-api-access-pwj9x\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.209866 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.209894 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.209931 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.210697 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.224228 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfcc3de8-d914-4608-966c-3d9a2dc2b11d" (UID: "bfcc3de8-d914-4608-966c-3d9a2dc2b11d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.261219 4867 scope.go:117] "RemoveContainer" containerID="99d1347bae0896eaf05c7869b589ab0dde6fedf80b16b270269796bbd356ccc2" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.312657 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfcc3de8-d914-4608-966c-3d9a2dc2b11d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.516784 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-gzkbb"] Dec 01 09:33:37 crc kubenswrapper[4867]: I1201 09:33:37.524621 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-gzkbb"] Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.255959 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lmdjf"] Dec 01 09:33:38 crc kubenswrapper[4867]: E1201 09:33:38.256706 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5292749-115d-4d53-9ad5-0000a87fe88d" containerName="dnsmasq-dns" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.256720 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5292749-115d-4d53-9ad5-0000a87fe88d" containerName="dnsmasq-dns" Dec 01 09:33:38 crc kubenswrapper[4867]: E1201 09:33:38.256733 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5292749-115d-4d53-9ad5-0000a87fe88d" containerName="init" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.256738 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5292749-115d-4d53-9ad5-0000a87fe88d" containerName="init" Dec 01 09:33:38 crc kubenswrapper[4867]: E1201 09:33:38.256778 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfcc3de8-d914-4608-966c-3d9a2dc2b11d" containerName="dnsmasq-dns" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 
09:33:38.256785 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcc3de8-d914-4608-966c-3d9a2dc2b11d" containerName="dnsmasq-dns" Dec 01 09:33:38 crc kubenswrapper[4867]: E1201 09:33:38.256793 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfcc3de8-d914-4608-966c-3d9a2dc2b11d" containerName="init" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.256801 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcc3de8-d914-4608-966c-3d9a2dc2b11d" containerName="init" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.257082 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfcc3de8-d914-4608-966c-3d9a2dc2b11d" containerName="dnsmasq-dns" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.257111 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5292749-115d-4d53-9ad5-0000a87fe88d" containerName="dnsmasq-dns" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.258593 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.266001 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmdjf"] Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.331492 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-catalog-content\") pod \"redhat-marketplace-lmdjf\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") " pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.331544 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-utilities\") pod \"redhat-marketplace-lmdjf\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") " pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.331577 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5zp2\" (UniqueName: \"kubernetes.io/projected/b19a5785-1bb3-47f1-a931-7c314aea2f90-kube-api-access-t5zp2\") pod \"redhat-marketplace-lmdjf\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") " pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.432860 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-catalog-content\") pod \"redhat-marketplace-lmdjf\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") " pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.432924 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-utilities\") pod \"redhat-marketplace-lmdjf\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") " pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.432973 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5zp2\" (UniqueName: \"kubernetes.io/projected/b19a5785-1bb3-47f1-a931-7c314aea2f90-kube-api-access-t5zp2\") pod \"redhat-marketplace-lmdjf\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") " pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.433762 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-catalog-content\") pod \"redhat-marketplace-lmdjf\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") " pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.433816 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-utilities\") pod \"redhat-marketplace-lmdjf\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") " pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.451857 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5zp2\" (UniqueName: \"kubernetes.io/projected/b19a5785-1bb3-47f1-a931-7c314aea2f90-kube-api-access-t5zp2\") pod \"redhat-marketplace-lmdjf\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") " pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.578487 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:38 crc kubenswrapper[4867]: I1201 09:33:38.843629 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfcc3de8-d914-4608-966c-3d9a2dc2b11d" path="/var/lib/kubelet/pods/bfcc3de8-d914-4608-966c-3d9a2dc2b11d/volumes" Dec 01 09:33:39 crc kubenswrapper[4867]: I1201 09:33:39.193343 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmdjf"] Dec 01 09:33:39 crc kubenswrapper[4867]: I1201 09:33:39.215796 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmdjf" event={"ID":"b19a5785-1bb3-47f1-a931-7c314aea2f90","Type":"ContainerStarted","Data":"d93272522bdccf74378a27b57056257cd6f4133a2639c088286cdaeed83e3a67"} Dec 01 09:33:40 crc kubenswrapper[4867]: I1201 09:33:40.226499 4867 generic.go:334] "Generic (PLEG): container finished" podID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerID="50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12" exitCode=0 Dec 01 09:33:40 crc kubenswrapper[4867]: I1201 09:33:40.226550 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmdjf" event={"ID":"b19a5785-1bb3-47f1-a931-7c314aea2f90","Type":"ContainerDied","Data":"50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12"} Dec 01 09:33:41 crc kubenswrapper[4867]: I1201 09:33:41.240226 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmdjf" event={"ID":"b19a5785-1bb3-47f1-a931-7c314aea2f90","Type":"ContainerStarted","Data":"3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2"} Dec 01 09:33:42 crc kubenswrapper[4867]: I1201 09:33:42.253895 4867 generic.go:334] "Generic (PLEG): container finished" podID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerID="3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2" exitCode=0 Dec 01 09:33:42 crc 
kubenswrapper[4867]: I1201 09:33:42.253983 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmdjf" event={"ID":"b19a5785-1bb3-47f1-a931-7c314aea2f90","Type":"ContainerDied","Data":"3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2"} Dec 01 09:33:43 crc kubenswrapper[4867]: I1201 09:33:43.266675 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmdjf" event={"ID":"b19a5785-1bb3-47f1-a931-7c314aea2f90","Type":"ContainerStarted","Data":"0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971"} Dec 01 09:33:43 crc kubenswrapper[4867]: I1201 09:33:43.289338 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lmdjf" podStartSLOduration=2.710895469 podStartE2EDuration="5.289312601s" podCreationTimestamp="2025-12-01 09:33:38 +0000 UTC" firstStartedPulling="2025-12-01 09:33:40.230255827 +0000 UTC m=+1541.689642571" lastFinishedPulling="2025-12-01 09:33:42.808672949 +0000 UTC m=+1544.268059703" observedRunningTime="2025-12-01 09:33:43.283077739 +0000 UTC m=+1544.742464513" watchObservedRunningTime="2025-12-01 09:33:43.289312601 +0000 UTC m=+1544.748699365" Dec 01 09:33:47 crc kubenswrapper[4867]: I1201 09:33:47.314113 4867 generic.go:334] "Generic (PLEG): container finished" podID="e3936faf-3dae-4db5-8851-10c1ebe7673b" containerID="f41f637e6f4aaa5bb9255d11dc3437e8c4eb2582446a818929e6dfb2b1c94b26" exitCode=0 Dec 01 09:33:47 crc kubenswrapper[4867]: I1201 09:33:47.314634 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3936faf-3dae-4db5-8851-10c1ebe7673b","Type":"ContainerDied","Data":"f41f637e6f4aaa5bb9255d11dc3437e8c4eb2582446a818929e6dfb2b1c94b26"} Dec 01 09:33:47 crc kubenswrapper[4867]: I1201 09:33:47.320133 4867 generic.go:334] "Generic (PLEG): container finished" podID="1a327b42-8b19-491b-a9ba-2c11f0227183" 
containerID="db7e591fa757c1945aa3a211892b8653e895690373a3b9609f4ffe1d6237c62d" exitCode=0 Dec 01 09:33:47 crc kubenswrapper[4867]: I1201 09:33:47.320178 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1a327b42-8b19-491b-a9ba-2c11f0227183","Type":"ContainerDied","Data":"db7e591fa757c1945aa3a211892b8653e895690373a3b9609f4ffe1d6237c62d"} Dec 01 09:33:48 crc kubenswrapper[4867]: I1201 09:33:48.333440 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1a327b42-8b19-491b-a9ba-2c11f0227183","Type":"ContainerStarted","Data":"c9e52cd31411b556322e6a417829609f2b72f907ba57deb354b3be445b033300"} Dec 01 09:33:48 crc kubenswrapper[4867]: I1201 09:33:48.335272 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:33:48 crc kubenswrapper[4867]: I1201 09:33:48.338839 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3936faf-3dae-4db5-8851-10c1ebe7673b","Type":"ContainerStarted","Data":"e2f63701a4af7d14039be0b50f5b9afbb99b22160d59a2842d7a199938d81c28"} Dec 01 09:33:48 crc kubenswrapper[4867]: I1201 09:33:48.339305 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 09:33:48 crc kubenswrapper[4867]: I1201 09:33:48.373279 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.373247422 podStartE2EDuration="36.373247422s" podCreationTimestamp="2025-12-01 09:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:48.355384381 +0000 UTC m=+1549.814771135" watchObservedRunningTime="2025-12-01 09:33:48.373247422 +0000 UTC m=+1549.832634196" Dec 01 09:33:48 crc kubenswrapper[4867]: I1201 09:33:48.391374 4867 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.39134395 podStartE2EDuration="36.39134395s" podCreationTimestamp="2025-12-01 09:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:48.382484656 +0000 UTC m=+1549.841871410" watchObservedRunningTime="2025-12-01 09:33:48.39134395 +0000 UTC m=+1549.850730724" Dec 01 09:33:48 crc kubenswrapper[4867]: I1201 09:33:48.578582 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:48 crc kubenswrapper[4867]: I1201 09:33:48.578859 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:48 crc kubenswrapper[4867]: I1201 09:33:48.652867 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:49 crc kubenswrapper[4867]: I1201 09:33:49.396796 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lmdjf" Dec 01 09:33:49 crc kubenswrapper[4867]: I1201 09:33:49.451050 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmdjf"] Dec 01 09:33:51 crc kubenswrapper[4867]: I1201 09:33:51.364508 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lmdjf" podUID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerName="registry-server" containerID="cri-o://0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971" gracePeriod=2 Dec 01 09:33:51 crc kubenswrapper[4867]: I1201 09:33:51.600884 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 09:33:51 crc kubenswrapper[4867]: I1201 09:33:51.601225 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 09:33:51 crc kubenswrapper[4867]: I1201 09:33:51.601273 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2"
Dec 01 09:33:51 crc kubenswrapper[4867]: I1201 09:33:51.602046 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 09:33:51 crc kubenswrapper[4867]: I1201 09:33:51.602101 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" gracePeriod=600
Dec 01 09:33:51 crc kubenswrapper[4867]: E1201 09:33:51.728772 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a"
Dec 01 09:33:51 crc kubenswrapper[4867]: I1201 09:33:51.975399 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmdjf"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.061588 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5zp2\" (UniqueName: \"kubernetes.io/projected/b19a5785-1bb3-47f1-a931-7c314aea2f90-kube-api-access-t5zp2\") pod \"b19a5785-1bb3-47f1-a931-7c314aea2f90\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") "
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.061927 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-catalog-content\") pod \"b19a5785-1bb3-47f1-a931-7c314aea2f90\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") "
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.062183 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-utilities\") pod \"b19a5785-1bb3-47f1-a931-7c314aea2f90\" (UID: \"b19a5785-1bb3-47f1-a931-7c314aea2f90\") "
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.064588 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-utilities" (OuterVolumeSpecName: "utilities") pod "b19a5785-1bb3-47f1-a931-7c314aea2f90" (UID: "b19a5785-1bb3-47f1-a931-7c314aea2f90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.088219 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b19a5785-1bb3-47f1-a931-7c314aea2f90" (UID: "b19a5785-1bb3-47f1-a931-7c314aea2f90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.095296 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19a5785-1bb3-47f1-a931-7c314aea2f90-kube-api-access-t5zp2" (OuterVolumeSpecName: "kube-api-access-t5zp2") pod "b19a5785-1bb3-47f1-a931-7c314aea2f90" (UID: "b19a5785-1bb3-47f1-a931-7c314aea2f90"). InnerVolumeSpecName "kube-api-access-t5zp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.165519 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5zp2\" (UniqueName: \"kubernetes.io/projected/b19a5785-1bb3-47f1-a931-7c314aea2f90-kube-api-access-t5zp2\") on node \"crc\" DevicePath \"\""
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.165553 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.165589 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19a5785-1bb3-47f1-a931-7c314aea2f90-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.335183 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mxjtt"]
Dec 01 09:33:52 crc kubenswrapper[4867]: E1201 09:33:52.335561 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerName="registry-server"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.335581 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerName="registry-server"
Dec 01 09:33:52 crc kubenswrapper[4867]: E1201 09:33:52.335602 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerName="extract-content"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.335613 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerName="extract-content"
Dec 01 09:33:52 crc kubenswrapper[4867]: E1201 09:33:52.335624 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerName="extract-utilities"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.335631 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerName="extract-utilities"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.335876 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerName="registry-server"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.337402 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.356028 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mxjtt"]
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.389573 4867 generic.go:334] "Generic (PLEG): container finished" podID="b19a5785-1bb3-47f1-a931-7c314aea2f90" containerID="0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971" exitCode=0
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.389647 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmdjf" event={"ID":"b19a5785-1bb3-47f1-a931-7c314aea2f90","Type":"ContainerDied","Data":"0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971"}
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.389682 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmdjf" event={"ID":"b19a5785-1bb3-47f1-a931-7c314aea2f90","Type":"ContainerDied","Data":"d93272522bdccf74378a27b57056257cd6f4133a2639c088286cdaeed83e3a67"}
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.389704 4867 scope.go:117] "RemoveContainer" containerID="0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.389914 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmdjf"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.400637 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" exitCode=0
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.407650 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949"}
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.408782 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949"
Dec 01 09:33:52 crc kubenswrapper[4867]: E1201 09:33:52.415837 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.443170 4867 scope.go:117] "RemoveContainer" containerID="3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.471631 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-utilities\") pod \"certified-operators-mxjtt\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") " pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.471785 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnc5x\" (UniqueName: \"kubernetes.io/projected/c38b1010-0547-4fac-b3e6-ae1f615b6b91-kube-api-access-hnc5x\") pod \"certified-operators-mxjtt\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") " pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.471863 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-catalog-content\") pod \"certified-operators-mxjtt\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") " pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.482275 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmdjf"]
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.492026 4867 scope.go:117] "RemoveContainer" containerID="50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.509945 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmdjf"]
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.578806 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-catalog-content\") pod \"certified-operators-mxjtt\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") " pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.578957 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-utilities\") pod \"certified-operators-mxjtt\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") " pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.579329 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-catalog-content\") pod \"certified-operators-mxjtt\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") " pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.579333 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-utilities\") pod \"certified-operators-mxjtt\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") " pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.579548 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnc5x\" (UniqueName: \"kubernetes.io/projected/c38b1010-0547-4fac-b3e6-ae1f615b6b91-kube-api-access-hnc5x\") pod \"certified-operators-mxjtt\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") " pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.587877 4867 scope.go:117] "RemoveContainer" containerID="0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971"
Dec 01 09:33:52 crc kubenswrapper[4867]: E1201 09:33:52.589328 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971\": container with ID starting with 0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971 not found: ID does not exist" containerID="0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.589359 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971"} err="failed to get container status \"0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971\": rpc error: code = NotFound desc = could not find container \"0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971\": container with ID starting with 0c68931f5259e3cb4e4b77b689f72777024e0ed942fa987e9669f027aee95971 not found: ID does not exist"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.589378 4867 scope.go:117] "RemoveContainer" containerID="3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2"
Dec 01 09:33:52 crc kubenswrapper[4867]: E1201 09:33:52.593797 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2\": container with ID starting with 3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2 not found: ID does not exist" containerID="3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.593869 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2"} err="failed to get container status \"3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2\": rpc error: code = NotFound desc = could not find container \"3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2\": container with ID starting with 3066839a1367abf4a5e34431da0a23829a29dec5d6d1cd85d87f7a8f80a886c2 not found: ID does not exist"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.593895 4867 scope.go:117] "RemoveContainer" containerID="50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12"
Dec 01 09:33:52 crc kubenswrapper[4867]: E1201 09:33:52.594795 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12\": container with ID starting with 50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12 not found: ID does not exist" containerID="50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.594834 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12"} err="failed to get container status \"50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12\": rpc error: code = NotFound desc = could not find container \"50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12\": container with ID starting with 50ed9997f0143c3b6c9b2627515878a052e8862e43de4adcc11314aa4b53fb12 not found: ID does not exist"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.594851 4867 scope.go:117] "RemoveContainer" containerID="7c10630f55a5e3f1966308004b9596564bba3f48b49f2091a432ccd55427b09a"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.611316 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnc5x\" (UniqueName: \"kubernetes.io/projected/c38b1010-0547-4fac-b3e6-ae1f615b6b91-kube-api-access-hnc5x\") pod \"certified-operators-mxjtt\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") " pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.654324 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:33:52 crc kubenswrapper[4867]: I1201 09:33:52.842418 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19a5785-1bb3-47f1-a931-7c314aea2f90" path="/var/lib/kubelet/pods/b19a5785-1bb3-47f1-a931-7c314aea2f90/volumes"
Dec 01 09:33:53 crc kubenswrapper[4867]: I1201 09:33:53.167460 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mxjtt"]
Dec 01 09:33:53 crc kubenswrapper[4867]: I1201 09:33:53.419910 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxjtt" event={"ID":"c38b1010-0547-4fac-b3e6-ae1f615b6b91","Type":"ContainerStarted","Data":"f9ba4226c032038af03a79bbc3faf3356df62f9d52ab28ad2c903a899a5f89c6"}
Dec 01 09:33:54 crc kubenswrapper[4867]: I1201 09:33:54.433605 4867 generic.go:334] "Generic (PLEG): container finished" podID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerID="722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60" exitCode=0
Dec 01 09:33:54 crc kubenswrapper[4867]: I1201 09:33:54.433917 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxjtt" event={"ID":"c38b1010-0547-4fac-b3e6-ae1f615b6b91","Type":"ContainerDied","Data":"722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60"}
Dec 01 09:33:55 crc kubenswrapper[4867]: I1201 09:33:55.476522 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxjtt" event={"ID":"c38b1010-0547-4fac-b3e6-ae1f615b6b91","Type":"ContainerStarted","Data":"b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122"}
Dec 01 09:33:58 crc kubenswrapper[4867]: I1201 09:33:58.978305 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"]
Dec 01 09:33:58 crc kubenswrapper[4867]: I1201 09:33:58.982359 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:58 crc kubenswrapper[4867]: I1201 09:33:58.990702 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"]
Dec 01 09:33:58 crc kubenswrapper[4867]: I1201 09:33:58.993176 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 09:33:58 crc kubenswrapper[4867]: I1201 09:33:58.993438 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg"
Dec 01 09:33:58 crc kubenswrapper[4867]: I1201 09:33:58.993656 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 09:33:58 crc kubenswrapper[4867]: I1201 09:33:58.993761 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.126629 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.126764 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.126837 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqdht\" (UniqueName: \"kubernetes.io/projected/d8a14454-7ca8-4a2d-8626-5234d29dd688-kube-api-access-tqdht\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.126922 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.229200 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.229341 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.229404 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.229441 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqdht\" (UniqueName: \"kubernetes.io/projected/d8a14454-7ca8-4a2d-8626-5234d29dd688-kube-api-access-tqdht\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.235954 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.236540 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.248530 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.248836 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqdht\" (UniqueName: \"kubernetes.io/projected/d8a14454-7ca8-4a2d-8626-5234d29dd688-kube-api-access-tqdht\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.316717 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.521503 4867 generic.go:334] "Generic (PLEG): container finished" podID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerID="b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122" exitCode=0
Dec 01 09:33:59 crc kubenswrapper[4867]: I1201 09:33:59.521896 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxjtt" event={"ID":"c38b1010-0547-4fac-b3e6-ae1f615b6b91","Type":"ContainerDied","Data":"b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122"}
Dec 01 09:34:01 crc kubenswrapper[4867]: I1201 09:34:01.097035 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm"]
Dec 01 09:34:01 crc kubenswrapper[4867]: W1201 09:34:01.108318 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8a14454_7ca8_4a2d_8626_5234d29dd688.slice/crio-eb70691c6a3b3d5843b6fa6aa7ab46bce217c9e28168d8326f9be8c3c4302c55 WatchSource:0}: Error finding container eb70691c6a3b3d5843b6fa6aa7ab46bce217c9e28168d8326f9be8c3c4302c55: Status 404 returned error can't find the container with id eb70691c6a3b3d5843b6fa6aa7ab46bce217c9e28168d8326f9be8c3c4302c55
Dec 01 09:34:01 crc kubenswrapper[4867]: I1201 09:34:01.544758 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm" event={"ID":"d8a14454-7ca8-4a2d-8626-5234d29dd688","Type":"ContainerStarted","Data":"eb70691c6a3b3d5843b6fa6aa7ab46bce217c9e28168d8326f9be8c3c4302c55"}
Dec 01 09:34:01 crc kubenswrapper[4867]: I1201 09:34:01.548274 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxjtt" event={"ID":"c38b1010-0547-4fac-b3e6-ae1f615b6b91","Type":"ContainerStarted","Data":"010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450"}
Dec 01 09:34:01 crc kubenswrapper[4867]: I1201 09:34:01.578077 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mxjtt" podStartSLOduration=3.411870821 podStartE2EDuration="9.57805948s" podCreationTimestamp="2025-12-01 09:33:52 +0000 UTC" firstStartedPulling="2025-12-01 09:33:54.435886675 +0000 UTC m=+1555.895273429" lastFinishedPulling="2025-12-01 09:34:00.602075334 +0000 UTC m=+1562.061462088" observedRunningTime="2025-12-01 09:34:01.571999974 +0000 UTC m=+1563.031386738" watchObservedRunningTime="2025-12-01 09:34:01.57805948 +0000 UTC m=+1563.037446234"
Dec 01 09:34:02 crc kubenswrapper[4867]: I1201 09:34:02.655114 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:34:02 crc kubenswrapper[4867]: I1201 09:34:02.655171 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:34:02 crc kubenswrapper[4867]: I1201 09:34:02.993050 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 01 09:34:03 crc kubenswrapper[4867]: I1201 09:34:03.328090 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 09:34:03 crc kubenswrapper[4867]: I1201 09:34:03.727526 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mxjtt" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="registry-server" probeResult="failure" output=<
Dec 01 09:34:03 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s
Dec 01 09:34:03 crc kubenswrapper[4867]: >
Dec 01 09:34:05 crc kubenswrapper[4867]: I1201 09:34:05.829408 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949"
Dec 01 09:34:05 crc kubenswrapper[4867]: E1201 09:34:05.830025 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a"
Dec 01 09:34:08 crc kubenswrapper[4867]: I1201 09:34:08.887977 4867 scope.go:117] "RemoveContainer" containerID="c12cd2cba9dafa9c7cf8aa656c006db0c49575f4b14ae890016ee20a169bbe14"
Dec 01 09:34:13 crc kubenswrapper[4867]: I1201 09:34:13.790600 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mxjtt" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="registry-server" probeResult="failure" output=<
Dec 01 09:34:13 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s
Dec 01 09:34:13 crc kubenswrapper[4867]: >
Dec 01 09:34:16 crc kubenswrapper[4867]: I1201 09:34:16.042298 4867 scope.go:117] "RemoveContainer" containerID="7e00294f2f5021f38a5460df958b260deab9be5063fbc31d2cf6db4058f6c3b8"
Dec 01 09:34:19 crc kubenswrapper[4867]: I1201 09:34:19.750108 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm" event={"ID":"d8a14454-7ca8-4a2d-8626-5234d29dd688","Type":"ContainerStarted","Data":"5d424380eab549af375882944fc5af7162e666778ff1a0fee2abd448e8cdab32"}
Dec 01 09:34:19 crc kubenswrapper[4867]: I1201 09:34:19.779755 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm" podStartSLOduration=4.004634424 podStartE2EDuration="21.779723834s" podCreationTimestamp="2025-12-01 09:33:58 +0000 UTC" firstStartedPulling="2025-12-01 09:34:01.110716834 +0000 UTC m=+1562.570103588" lastFinishedPulling="2025-12-01 09:34:18.885806244 +0000 UTC m=+1580.345192998" observedRunningTime="2025-12-01 09:34:19.771568441 +0000 UTC m=+1581.230955215" watchObservedRunningTime="2025-12-01 09:34:19.779723834 +0000 UTC m=+1581.239110588"
Dec 01 09:34:20 crc kubenswrapper[4867]: I1201 09:34:20.832200 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949"
Dec 01 09:34:20 crc kubenswrapper[4867]: E1201 09:34:20.832735 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a"
Dec 01 09:34:22 crc kubenswrapper[4867]: I1201 09:34:22.711311 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:34:22 crc kubenswrapper[4867]: I1201 09:34:22.768559 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:34:23 crc kubenswrapper[4867]: I1201 09:34:23.545669 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mxjtt"]
Dec 01 09:34:23 crc kubenswrapper[4867]: I1201 09:34:23.785228 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mxjtt" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="registry-server" containerID="cri-o://010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450" gracePeriod=2
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.462553 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxjtt"
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.645092 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-catalog-content\") pod \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") "
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.645429 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-utilities\") pod \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") "
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.645515 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnc5x\" (UniqueName: \"kubernetes.io/projected/c38b1010-0547-4fac-b3e6-ae1f615b6b91-kube-api-access-hnc5x\") pod \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\" (UID: \"c38b1010-0547-4fac-b3e6-ae1f615b6b91\") "
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.646692 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-utilities" (OuterVolumeSpecName: "utilities") pod "c38b1010-0547-4fac-b3e6-ae1f615b6b91" (UID: "c38b1010-0547-4fac-b3e6-ae1f615b6b91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.669740 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c38b1010-0547-4fac-b3e6-ae1f615b6b91-kube-api-access-hnc5x" (OuterVolumeSpecName: "kube-api-access-hnc5x") pod "c38b1010-0547-4fac-b3e6-ae1f615b6b91" (UID: "c38b1010-0547-4fac-b3e6-ae1f615b6b91"). InnerVolumeSpecName "kube-api-access-hnc5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.713493 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c38b1010-0547-4fac-b3e6-ae1f615b6b91" (UID: "c38b1010-0547-4fac-b3e6-ae1f615b6b91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.748422 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnc5x\" (UniqueName: \"kubernetes.io/projected/c38b1010-0547-4fac-b3e6-ae1f615b6b91-kube-api-access-hnc5x\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.748479 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.748489 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c38b1010-0547-4fac-b3e6-ae1f615b6b91-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.800670 4867 generic.go:334] "Generic (PLEG): container finished" podID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerID="010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450" exitCode=0
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.800741 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxjtt" event={"ID":"c38b1010-0547-4fac-b3e6-ae1f615b6b91","Type":"ContainerDied","Data":"010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450"}
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.800772 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxjtt" event={"ID":"c38b1010-0547-4fac-b3e6-ae1f615b6b91","Type":"ContainerDied","Data":"f9ba4226c032038af03a79bbc3faf3356df62f9d52ab28ad2c903a899a5f89c6"}
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.800793 4867 scope.go:117] "RemoveContainer" containerID="010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450"
Dec 01 09:34:24 crc kubenswrapper[4867]: I1201
09:34:24.800959 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxjtt" Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.859100 4867 scope.go:117] "RemoveContainer" containerID="b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122" Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.890494 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mxjtt"] Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.890705 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mxjtt"] Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.914404 4867 scope.go:117] "RemoveContainer" containerID="722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60" Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.954863 4867 scope.go:117] "RemoveContainer" containerID="010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450" Dec 01 09:34:24 crc kubenswrapper[4867]: E1201 09:34:24.955921 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450\": container with ID starting with 010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450 not found: ID does not exist" containerID="010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450" Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.955956 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450"} err="failed to get container status \"010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450\": rpc error: code = NotFound desc = could not find container \"010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450\": container with ID starting with 
010ba6e2f5b66aee0c0410bc83c30255ffd4f7a0654e3e9a3ca7d30a45dfc450 not found: ID does not exist" Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.955976 4867 scope.go:117] "RemoveContainer" containerID="b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122" Dec 01 09:34:24 crc kubenswrapper[4867]: E1201 09:34:24.956253 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122\": container with ID starting with b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122 not found: ID does not exist" containerID="b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122" Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.956286 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122"} err="failed to get container status \"b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122\": rpc error: code = NotFound desc = could not find container \"b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122\": container with ID starting with b501290b1a47504287a22c285a96e05df0ce8f340038e624273ff9218b4b6122 not found: ID does not exist" Dec 01 09:34:24 crc kubenswrapper[4867]: I1201 09:34:24.956303 4867 scope.go:117] "RemoveContainer" containerID="722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60" Dec 01 09:34:24 crc kubenswrapper[4867]: E1201 09:34:24.956555 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60\": container with ID starting with 722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60 not found: ID does not exist" containerID="722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60" Dec 01 09:34:24 crc 
kubenswrapper[4867]: I1201 09:34:24.956573 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60"} err="failed to get container status \"722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60\": rpc error: code = NotFound desc = could not find container \"722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60\": container with ID starting with 722302aaec82d8dce3fc1ce240729bc09a03018aac82be34b7ceefec95fcda60 not found: ID does not exist" Dec 01 09:34:26 crc kubenswrapper[4867]: I1201 09:34:26.838369 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" path="/var/lib/kubelet/pods/c38b1010-0547-4fac-b3e6-ae1f615b6b91/volumes" Dec 01 09:34:33 crc kubenswrapper[4867]: I1201 09:34:33.827772 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:34:33 crc kubenswrapper[4867]: E1201 09:34:33.828592 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:34:36 crc kubenswrapper[4867]: I1201 09:34:36.919710 4867 generic.go:334] "Generic (PLEG): container finished" podID="d8a14454-7ca8-4a2d-8626-5234d29dd688" containerID="5d424380eab549af375882944fc5af7162e666778ff1a0fee2abd448e8cdab32" exitCode=0 Dec 01 09:34:36 crc kubenswrapper[4867]: I1201 09:34:36.919785 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm" 
event={"ID":"d8a14454-7ca8-4a2d-8626-5234d29dd688","Type":"ContainerDied","Data":"5d424380eab549af375882944fc5af7162e666778ff1a0fee2abd448e8cdab32"} Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.330388 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.502154 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-repo-setup-combined-ca-bundle\") pod \"d8a14454-7ca8-4a2d-8626-5234d29dd688\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.502368 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-inventory\") pod \"d8a14454-7ca8-4a2d-8626-5234d29dd688\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.502423 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqdht\" (UniqueName: \"kubernetes.io/projected/d8a14454-7ca8-4a2d-8626-5234d29dd688-kube-api-access-tqdht\") pod \"d8a14454-7ca8-4a2d-8626-5234d29dd688\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.502492 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-ssh-key\") pod \"d8a14454-7ca8-4a2d-8626-5234d29dd688\" (UID: \"d8a14454-7ca8-4a2d-8626-5234d29dd688\") " Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.508737 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d8a14454-7ca8-4a2d-8626-5234d29dd688" (UID: "d8a14454-7ca8-4a2d-8626-5234d29dd688"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.516377 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a14454-7ca8-4a2d-8626-5234d29dd688-kube-api-access-tqdht" (OuterVolumeSpecName: "kube-api-access-tqdht") pod "d8a14454-7ca8-4a2d-8626-5234d29dd688" (UID: "d8a14454-7ca8-4a2d-8626-5234d29dd688"). InnerVolumeSpecName "kube-api-access-tqdht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.541623 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-inventory" (OuterVolumeSpecName: "inventory") pod "d8a14454-7ca8-4a2d-8626-5234d29dd688" (UID: "d8a14454-7ca8-4a2d-8626-5234d29dd688"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.542264 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d8a14454-7ca8-4a2d-8626-5234d29dd688" (UID: "d8a14454-7ca8-4a2d-8626-5234d29dd688"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.607018 4867 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.607080 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.607100 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqdht\" (UniqueName: \"kubernetes.io/projected/d8a14454-7ca8-4a2d-8626-5234d29dd688-kube-api-access-tqdht\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.607116 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8a14454-7ca8-4a2d-8626-5234d29dd688-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.949448 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm" event={"ID":"d8a14454-7ca8-4a2d-8626-5234d29dd688","Type":"ContainerDied","Data":"eb70691c6a3b3d5843b6fa6aa7ab46bce217c9e28168d8326f9be8c3c4302c55"} Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.950223 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb70691c6a3b3d5843b6fa6aa7ab46bce217c9e28168d8326f9be8c3c4302c55" Dec 01 09:34:38 crc kubenswrapper[4867]: I1201 09:34:38.949726 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.055608 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp"] Dec 01 09:34:39 crc kubenswrapper[4867]: E1201 09:34:39.056217 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a14454-7ca8-4a2d-8626-5234d29dd688" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.056243 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a14454-7ca8-4a2d-8626-5234d29dd688" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 09:34:39 crc kubenswrapper[4867]: E1201 09:34:39.056267 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="extract-utilities" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.056279 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="extract-utilities" Dec 01 09:34:39 crc kubenswrapper[4867]: E1201 09:34:39.056302 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="extract-content" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.056311 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="extract-content" Dec 01 09:34:39 crc kubenswrapper[4867]: E1201 09:34:39.056337 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="registry-server" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.056344 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="registry-server" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.056592 4867 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a14454-7ca8-4a2d-8626-5234d29dd688" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.056613 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c38b1010-0547-4fac-b3e6-ae1f615b6b91" containerName="registry-server" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.057401 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.060794 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.060939 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.060794 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.061109 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.068281 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp"] Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.216738 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pk2lp\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.217253 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgx6f\" (UniqueName: \"kubernetes.io/projected/1560ec78-ac43-47a7-ab73-69a7decf4ed8-kube-api-access-hgx6f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pk2lp\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.217398 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pk2lp\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.319397 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgx6f\" (UniqueName: \"kubernetes.io/projected/1560ec78-ac43-47a7-ab73-69a7decf4ed8-kube-api-access-hgx6f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pk2lp\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.319715 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pk2lp\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.319899 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pk2lp\" (UID: 
\"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.326326 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pk2lp\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.337360 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pk2lp\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.337829 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgx6f\" (UniqueName: \"kubernetes.io/projected/1560ec78-ac43-47a7-ab73-69a7decf4ed8-kube-api-access-hgx6f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pk2lp\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.379601 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.919847 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp"] Dec 01 09:34:39 crc kubenswrapper[4867]: I1201 09:34:39.969830 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" event={"ID":"1560ec78-ac43-47a7-ab73-69a7decf4ed8","Type":"ContainerStarted","Data":"7c282097a9c79f7c66ae31ef1bc0bdaa9c05a318051bbed982eae7241404b563"} Dec 01 09:34:40 crc kubenswrapper[4867]: I1201 09:34:40.979921 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" event={"ID":"1560ec78-ac43-47a7-ab73-69a7decf4ed8","Type":"ContainerStarted","Data":"f876f7557a314df822fee45c64b45d99df4a06a23d81d7dd835ec45c95d221e8"} Dec 01 09:34:44 crc kubenswrapper[4867]: I1201 09:34:44.008108 4867 generic.go:334] "Generic (PLEG): container finished" podID="1560ec78-ac43-47a7-ab73-69a7decf4ed8" containerID="f876f7557a314df822fee45c64b45d99df4a06a23d81d7dd835ec45c95d221e8" exitCode=0 Dec 01 09:34:44 crc kubenswrapper[4867]: I1201 09:34:44.008175 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" event={"ID":"1560ec78-ac43-47a7-ab73-69a7decf4ed8","Type":"ContainerDied","Data":"f876f7557a314df822fee45c64b45d99df4a06a23d81d7dd835ec45c95d221e8"} Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.465176 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.635093 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgx6f\" (UniqueName: \"kubernetes.io/projected/1560ec78-ac43-47a7-ab73-69a7decf4ed8-kube-api-access-hgx6f\") pod \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.635196 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-ssh-key\") pod \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.635420 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-inventory\") pod \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\" (UID: \"1560ec78-ac43-47a7-ab73-69a7decf4ed8\") " Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.647944 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1560ec78-ac43-47a7-ab73-69a7decf4ed8-kube-api-access-hgx6f" (OuterVolumeSpecName: "kube-api-access-hgx6f") pod "1560ec78-ac43-47a7-ab73-69a7decf4ed8" (UID: "1560ec78-ac43-47a7-ab73-69a7decf4ed8"). InnerVolumeSpecName "kube-api-access-hgx6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.667376 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1560ec78-ac43-47a7-ab73-69a7decf4ed8" (UID: "1560ec78-ac43-47a7-ab73-69a7decf4ed8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.683362 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-inventory" (OuterVolumeSpecName: "inventory") pod "1560ec78-ac43-47a7-ab73-69a7decf4ed8" (UID: "1560ec78-ac43-47a7-ab73-69a7decf4ed8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.737714 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgx6f\" (UniqueName: \"kubernetes.io/projected/1560ec78-ac43-47a7-ab73-69a7decf4ed8-kube-api-access-hgx6f\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.737741 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.737751 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1560ec78-ac43-47a7-ab73-69a7decf4ed8-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:45 crc kubenswrapper[4867]: I1201 09:34:45.827149 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:34:45 crc kubenswrapper[4867]: E1201 09:34:45.827516 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.028502 
4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" event={"ID":"1560ec78-ac43-47a7-ab73-69a7decf4ed8","Type":"ContainerDied","Data":"7c282097a9c79f7c66ae31ef1bc0bdaa9c05a318051bbed982eae7241404b563"} Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.028545 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c282097a9c79f7c66ae31ef1bc0bdaa9c05a318051bbed982eae7241404b563" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.028600 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pk2lp" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.112981 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw"] Dec 01 09:34:46 crc kubenswrapper[4867]: E1201 09:34:46.113462 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1560ec78-ac43-47a7-ab73-69a7decf4ed8" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.113488 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1560ec78-ac43-47a7-ab73-69a7decf4ed8" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.113774 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1560ec78-ac43-47a7-ab73-69a7decf4ed8" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.114499 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.117997 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.118127 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.119621 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.122836 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.123731 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw"] Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.250182 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.250381 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.250542 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfbj\" (UniqueName: \"kubernetes.io/projected/a32a973f-6473-444b-a71a-d848773d8de2-kube-api-access-jkfbj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.250582 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.352741 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.352849 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfbj\" (UniqueName: \"kubernetes.io/projected/a32a973f-6473-444b-a71a-d848773d8de2-kube-api-access-jkfbj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.352889 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.352953 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.376680 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.379835 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.380856 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkfbj\" (UniqueName: \"kubernetes.io/projected/a32a973f-6473-444b-a71a-d848773d8de2-kube-api-access-jkfbj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.388323 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:46 crc kubenswrapper[4867]: I1201 09:34:46.432617 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:34:47 crc kubenswrapper[4867]: I1201 09:34:47.152879 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw"] Dec 01 09:34:48 crc kubenswrapper[4867]: I1201 09:34:48.055263 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" event={"ID":"a32a973f-6473-444b-a71a-d848773d8de2","Type":"ContainerStarted","Data":"5203394fff396f35fb28c482a17a0cc0391c06ec5d09b4bcb5343d5a29b7f39a"} Dec 01 09:34:48 crc kubenswrapper[4867]: I1201 09:34:48.055634 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" event={"ID":"a32a973f-6473-444b-a71a-d848773d8de2","Type":"ContainerStarted","Data":"f906d35c0dbc29840a7396d102338fb0a7257f09bd107f588cb601cd08268fe6"} Dec 01 09:34:48 crc kubenswrapper[4867]: I1201 09:34:48.076837 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" podStartSLOduration=1.851109128 podStartE2EDuration="2.076797691s" podCreationTimestamp="2025-12-01 09:34:46 +0000 UTC" firstStartedPulling="2025-12-01 09:34:47.167029444 +0000 UTC m=+1608.626416198" lastFinishedPulling="2025-12-01 09:34:47.392718007 +0000 UTC m=+1608.852104761" observedRunningTime="2025-12-01 09:34:48.071749062 +0000 UTC m=+1609.531135816" watchObservedRunningTime="2025-12-01 
09:34:48.076797691 +0000 UTC m=+1609.536184455" Dec 01 09:34:58 crc kubenswrapper[4867]: I1201 09:34:58.837095 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:34:58 crc kubenswrapper[4867]: E1201 09:34:58.838168 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.270288 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hnxjh"] Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.272345 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.290522 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnxjh"] Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.413671 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkt5v\" (UniqueName: \"kubernetes.io/projected/de64ea35-2fcf-4474-b7e9-891529a07a00-kube-api-access-wkt5v\") pod \"community-operators-hnxjh\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.414054 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-catalog-content\") pod \"community-operators-hnxjh\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.414110 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-utilities\") pod \"community-operators-hnxjh\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.515907 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkt5v\" (UniqueName: \"kubernetes.io/projected/de64ea35-2fcf-4474-b7e9-891529a07a00-kube-api-access-wkt5v\") pod \"community-operators-hnxjh\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.520700 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-catalog-content\") pod \"community-operators-hnxjh\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.520863 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-utilities\") pod \"community-operators-hnxjh\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.521516 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-catalog-content\") pod \"community-operators-hnxjh\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.521562 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-utilities\") pod \"community-operators-hnxjh\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.537938 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkt5v\" (UniqueName: \"kubernetes.io/projected/de64ea35-2fcf-4474-b7e9-891529a07a00-kube-api-access-wkt5v\") pod \"community-operators-hnxjh\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:34:59 crc kubenswrapper[4867]: I1201 09:34:59.594931 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:35:00 crc kubenswrapper[4867]: I1201 09:35:00.179066 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnxjh"] Dec 01 09:35:01 crc kubenswrapper[4867]: I1201 09:35:01.209797 4867 generic.go:334] "Generic (PLEG): container finished" podID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerID="7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e" exitCode=0 Dec 01 09:35:01 crc kubenswrapper[4867]: I1201 09:35:01.210011 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnxjh" event={"ID":"de64ea35-2fcf-4474-b7e9-891529a07a00","Type":"ContainerDied","Data":"7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e"} Dec 01 09:35:01 crc kubenswrapper[4867]: I1201 09:35:01.210129 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnxjh" event={"ID":"de64ea35-2fcf-4474-b7e9-891529a07a00","Type":"ContainerStarted","Data":"5dd89787656b4240060a4a906e66f5ff72b9b92286cdeeb9f022da2d3f4959bc"} Dec 01 09:35:03 crc kubenswrapper[4867]: I1201 09:35:03.271604 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnxjh" event={"ID":"de64ea35-2fcf-4474-b7e9-891529a07a00","Type":"ContainerStarted","Data":"950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2"} Dec 01 09:35:05 crc kubenswrapper[4867]: I1201 09:35:05.292754 4867 generic.go:334] "Generic (PLEG): container finished" podID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerID="950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2" exitCode=0 Dec 01 09:35:05 crc kubenswrapper[4867]: I1201 09:35:05.292796 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnxjh" 
event={"ID":"de64ea35-2fcf-4474-b7e9-891529a07a00","Type":"ContainerDied","Data":"950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2"} Dec 01 09:35:07 crc kubenswrapper[4867]: I1201 09:35:07.318125 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnxjh" event={"ID":"de64ea35-2fcf-4474-b7e9-891529a07a00","Type":"ContainerStarted","Data":"bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02"} Dec 01 09:35:07 crc kubenswrapper[4867]: I1201 09:35:07.347468 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hnxjh" podStartSLOduration=3.243959269 podStartE2EDuration="8.347445939s" podCreationTimestamp="2025-12-01 09:34:59 +0000 UTC" firstStartedPulling="2025-12-01 09:35:01.215865081 +0000 UTC m=+1622.675251835" lastFinishedPulling="2025-12-01 09:35:06.319351751 +0000 UTC m=+1627.778738505" observedRunningTime="2025-12-01 09:35:07.337945077 +0000 UTC m=+1628.797331851" watchObservedRunningTime="2025-12-01 09:35:07.347445939 +0000 UTC m=+1628.806832693" Dec 01 09:35:09 crc kubenswrapper[4867]: I1201 09:35:09.596083 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:35:09 crc kubenswrapper[4867]: I1201 09:35:09.596135 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:35:09 crc kubenswrapper[4867]: I1201 09:35:09.651125 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:35:09 crc kubenswrapper[4867]: I1201 09:35:09.827570 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:35:09 crc kubenswrapper[4867]: E1201 09:35:09.827784 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:35:18 crc kubenswrapper[4867]: I1201 09:35:18.949551 4867 scope.go:117] "RemoveContainer" containerID="972971977dedf38a39bb58786ff5bf497a0a17d55c293f2e13921de6020e1309" Dec 01 09:35:18 crc kubenswrapper[4867]: I1201 09:35:18.976953 4867 scope.go:117] "RemoveContainer" containerID="337f0022c97a7421961b20b0f515b48fb2fc7f68bcfdc9bbbccf360285a5322f" Dec 01 09:35:19 crc kubenswrapper[4867]: I1201 09:35:19.648542 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:35:19 crc kubenswrapper[4867]: I1201 09:35:19.701493 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnxjh"] Dec 01 09:35:20 crc kubenswrapper[4867]: I1201 09:35:20.428693 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hnxjh" podUID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerName="registry-server" containerID="cri-o://bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02" gracePeriod=2 Dec 01 09:35:20 crc kubenswrapper[4867]: I1201 09:35:20.895006 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:35:20 crc kubenswrapper[4867]: I1201 09:35:20.986148 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkt5v\" (UniqueName: \"kubernetes.io/projected/de64ea35-2fcf-4474-b7e9-891529a07a00-kube-api-access-wkt5v\") pod \"de64ea35-2fcf-4474-b7e9-891529a07a00\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " Dec 01 09:35:20 crc kubenswrapper[4867]: I1201 09:35:20.986216 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-utilities\") pod \"de64ea35-2fcf-4474-b7e9-891529a07a00\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " Dec 01 09:35:20 crc kubenswrapper[4867]: I1201 09:35:20.986292 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-catalog-content\") pod \"de64ea35-2fcf-4474-b7e9-891529a07a00\" (UID: \"de64ea35-2fcf-4474-b7e9-891529a07a00\") " Dec 01 09:35:20 crc kubenswrapper[4867]: I1201 09:35:20.987558 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-utilities" (OuterVolumeSpecName: "utilities") pod "de64ea35-2fcf-4474-b7e9-891529a07a00" (UID: "de64ea35-2fcf-4474-b7e9-891529a07a00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:35:20 crc kubenswrapper[4867]: I1201 09:35:20.995301 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de64ea35-2fcf-4474-b7e9-891529a07a00-kube-api-access-wkt5v" (OuterVolumeSpecName: "kube-api-access-wkt5v") pod "de64ea35-2fcf-4474-b7e9-891529a07a00" (UID: "de64ea35-2fcf-4474-b7e9-891529a07a00"). InnerVolumeSpecName "kube-api-access-wkt5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.044729 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de64ea35-2fcf-4474-b7e9-891529a07a00" (UID: "de64ea35-2fcf-4474-b7e9-891529a07a00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.089074 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkt5v\" (UniqueName: \"kubernetes.io/projected/de64ea35-2fcf-4474-b7e9-891529a07a00-kube-api-access-wkt5v\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.089110 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.089120 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de64ea35-2fcf-4474-b7e9-891529a07a00-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.441314 4867 generic.go:334] "Generic (PLEG): container finished" podID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerID="bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02" exitCode=0 Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.441386 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnxjh" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.441398 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnxjh" event={"ID":"de64ea35-2fcf-4474-b7e9-891529a07a00","Type":"ContainerDied","Data":"bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02"} Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.444070 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnxjh" event={"ID":"de64ea35-2fcf-4474-b7e9-891529a07a00","Type":"ContainerDied","Data":"5dd89787656b4240060a4a906e66f5ff72b9b92286cdeeb9f022da2d3f4959bc"} Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.444146 4867 scope.go:117] "RemoveContainer" containerID="bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.482927 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnxjh"] Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.484793 4867 scope.go:117] "RemoveContainer" containerID="950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.491517 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hnxjh"] Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.509623 4867 scope.go:117] "RemoveContainer" containerID="7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.555475 4867 scope.go:117] "RemoveContainer" containerID="bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02" Dec 01 09:35:21 crc kubenswrapper[4867]: E1201 09:35:21.555857 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02\": container with ID starting with bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02 not found: ID does not exist" containerID="bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.555890 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02"} err="failed to get container status \"bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02\": rpc error: code = NotFound desc = could not find container \"bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02\": container with ID starting with bfce55f548f1bc66ec19af2d9fda50690b9addd9eee5aa9270200a2e54805c02 not found: ID does not exist" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.555908 4867 scope.go:117] "RemoveContainer" containerID="950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2" Dec 01 09:35:21 crc kubenswrapper[4867]: E1201 09:35:21.556253 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2\": container with ID starting with 950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2 not found: ID does not exist" containerID="950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.556276 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2"} err="failed to get container status \"950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2\": rpc error: code = NotFound desc = could not find container \"950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2\": container with ID 
starting with 950b126b8d7aca809b3599a5c5a37b7e52cf33c73557f7310de5bb3c484418f2 not found: ID does not exist" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.556290 4867 scope.go:117] "RemoveContainer" containerID="7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e" Dec 01 09:35:21 crc kubenswrapper[4867]: E1201 09:35:21.556492 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e\": container with ID starting with 7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e not found: ID does not exist" containerID="7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e" Dec 01 09:35:21 crc kubenswrapper[4867]: I1201 09:35:21.556521 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e"} err="failed to get container status \"7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e\": rpc error: code = NotFound desc = could not find container \"7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e\": container with ID starting with 7d08f0e183bba7e6ee37916e0c93474c3d5ccfb3c4ac66c29953a8659720983e not found: ID does not exist" Dec 01 09:35:22 crc kubenswrapper[4867]: I1201 09:35:22.837763 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de64ea35-2fcf-4474-b7e9-891529a07a00" path="/var/lib/kubelet/pods/de64ea35-2fcf-4474-b7e9-891529a07a00/volumes" Dec 01 09:35:24 crc kubenswrapper[4867]: I1201 09:35:24.827657 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:35:24 crc kubenswrapper[4867]: E1201 09:35:24.828308 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:35:36 crc kubenswrapper[4867]: I1201 09:35:36.827903 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:35:36 crc kubenswrapper[4867]: E1201 09:35:36.828698 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:35:48 crc kubenswrapper[4867]: I1201 09:35:48.833854 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:35:48 crc kubenswrapper[4867]: E1201 09:35:48.834550 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:36:03 crc kubenswrapper[4867]: I1201 09:36:03.826698 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:36:03 crc kubenswrapper[4867]: E1201 09:36:03.827622 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:36:14 crc kubenswrapper[4867]: I1201 09:36:14.827879 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:36:14 crc kubenswrapper[4867]: E1201 09:36:14.829384 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:36:19 crc kubenswrapper[4867]: I1201 09:36:19.090401 4867 scope.go:117] "RemoveContainer" containerID="34a0c4a81de1bb4129e5d06558eddcb546c909993d7ec0987b5f53321ba11299" Dec 01 09:36:19 crc kubenswrapper[4867]: I1201 09:36:19.117164 4867 scope.go:117] "RemoveContainer" containerID="8163eb01f825678238e1d90d70285bd60f1cbc2337d58fb0372fc95e5b2fd651" Dec 01 09:36:19 crc kubenswrapper[4867]: I1201 09:36:19.134892 4867 scope.go:117] "RemoveContainer" containerID="0c414ebcbdc697cbaed9de55d6ccc1a0159a0be1d4d49a29a38ea5dd5b2ff49a" Dec 01 09:36:19 crc kubenswrapper[4867]: I1201 09:36:19.156028 4867 scope.go:117] "RemoveContainer" containerID="2bcb281e6c5cfdef3c3ac69f23f193b3da184dadb14a85047bd81fa3047fae34" Dec 01 09:36:26 crc kubenswrapper[4867]: I1201 09:36:26.828151 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:36:26 crc kubenswrapper[4867]: E1201 09:36:26.828889 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:36:38 crc kubenswrapper[4867]: I1201 09:36:38.835968 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:36:38 crc kubenswrapper[4867]: E1201 09:36:38.836843 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:36:50 crc kubenswrapper[4867]: I1201 09:36:50.827255 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:36:50 crc kubenswrapper[4867]: E1201 09:36:50.828108 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:37:05 crc kubenswrapper[4867]: I1201 09:37:05.827602 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:37:05 crc kubenswrapper[4867]: E1201 09:37:05.828608 4867 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:37:11 crc kubenswrapper[4867]: I1201 09:37:11.137543 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-609c-account-create-update-8rb7t"] Dec 01 09:37:11 crc kubenswrapper[4867]: I1201 09:37:11.148157 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-776fv"] Dec 01 09:37:11 crc kubenswrapper[4867]: I1201 09:37:11.160481 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-609c-account-create-update-8rb7t"] Dec 01 09:37:11 crc kubenswrapper[4867]: I1201 09:37:11.168935 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-776fv"] Dec 01 09:37:12 crc kubenswrapper[4867]: I1201 09:37:12.839488 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b3a692-2503-4619-8b20-090eefce0fa5" path="/var/lib/kubelet/pods/64b3a692-2503-4619-8b20-090eefce0fa5/volumes" Dec 01 09:37:12 crc kubenswrapper[4867]: I1201 09:37:12.840722 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d49ca85-8824-4830-b123-56cc15703c4a" path="/var/lib/kubelet/pods/8d49ca85-8824-4830-b123-56cc15703c4a/volumes" Dec 01 09:37:15 crc kubenswrapper[4867]: I1201 09:37:15.040804 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vhjsc"] Dec 01 09:37:15 crc kubenswrapper[4867]: I1201 09:37:15.051465 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vhjsc"] Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.029641 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-5e76-account-create-update-8gv64"] Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.039942 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c9d7-account-create-update-sqvdl"] Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.050721 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-vnj5m"] Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.061668 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5e76-account-create-update-8gv64"] Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.071828 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c9d7-account-create-update-sqvdl"] Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.080836 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-vnj5m"] Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.838227 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10286952-3989-4e1b-ab98-2971420319da" path="/var/lib/kubelet/pods/10286952-3989-4e1b-ab98-2971420319da/volumes" Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.838844 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec5a60c-5b6d-49b0-b34c-f61df33220a5" path="/var/lib/kubelet/pods/2ec5a60c-5b6d-49b0-b34c-f61df33220a5/volumes" Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.839461 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3baefa6f-f6ea-43ce-978c-dcd5be45de35" path="/var/lib/kubelet/pods/3baefa6f-f6ea-43ce-978c-dcd5be45de35/volumes" Dec 01 09:37:16 crc kubenswrapper[4867]: I1201 09:37:16.840112 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8228e65b-ce56-48da-b9c5-770632f03a1c" path="/var/lib/kubelet/pods/8228e65b-ce56-48da-b9c5-770632f03a1c/volumes" Dec 01 09:37:18 crc kubenswrapper[4867]: I1201 09:37:18.841921 4867 scope.go:117] 
"RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:37:18 crc kubenswrapper[4867]: E1201 09:37:18.842551 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:37:19 crc kubenswrapper[4867]: I1201 09:37:19.223803 4867 scope.go:117] "RemoveContainer" containerID="53ca0f9e759b18b286bb20502ca8495c896f1c281c3f8441f2ad1262de6d2f4a" Dec 01 09:37:19 crc kubenswrapper[4867]: I1201 09:37:19.265217 4867 scope.go:117] "RemoveContainer" containerID="ab340cc80d79466b8050c6d7e32f02329c6c3f3e4e47a95d100d8acd09266503" Dec 01 09:37:19 crc kubenswrapper[4867]: I1201 09:37:19.326022 4867 scope.go:117] "RemoveContainer" containerID="32bdec604d4bbcd7757a02fe3f148ffa88f5409b101ca86636d66c77ebcaf824" Dec 01 09:37:19 crc kubenswrapper[4867]: I1201 09:37:19.361941 4867 scope.go:117] "RemoveContainer" containerID="1c0bea86ba295a268d5b3df1c30112bbfb69bc2d940da45279664813da345de8" Dec 01 09:37:19 crc kubenswrapper[4867]: I1201 09:37:19.406807 4867 scope.go:117] "RemoveContainer" containerID="17cfad9e0f0952326b47e5b9fa4203f91932baf87f4b928c74c5cf60866d1760" Dec 01 09:37:19 crc kubenswrapper[4867]: I1201 09:37:19.446402 4867 scope.go:117] "RemoveContainer" containerID="862230330f020c1118a30a0d5648a3c649a9f2bb119c0cfa9cda5b932d73be92" Dec 01 09:37:32 crc kubenswrapper[4867]: I1201 09:37:32.826974 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:37:32 crc kubenswrapper[4867]: E1201 09:37:32.827726 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:37:40 crc kubenswrapper[4867]: I1201 09:37:40.046658 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-jx8tw"] Dec 01 09:37:40 crc kubenswrapper[4867]: I1201 09:37:40.057097 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jx8tw"] Dec 01 09:37:40 crc kubenswrapper[4867]: I1201 09:37:40.844899 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1964493b-eb78-487f-8210-3f6323e55583" path="/var/lib/kubelet/pods/1964493b-eb78-487f-8210-3f6323e55583/volumes" Dec 01 09:37:44 crc kubenswrapper[4867]: I1201 09:37:44.827172 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:37:44 crc kubenswrapper[4867]: E1201 09:37:44.827860 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:37:52 crc kubenswrapper[4867]: I1201 09:37:52.040076 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-99g2f"] Dec 01 09:37:52 crc kubenswrapper[4867]: I1201 09:37:52.048685 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-99g2f"] Dec 01 09:37:52 crc kubenswrapper[4867]: I1201 09:37:52.843296 4867 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="fedf8a1e-7645-4e0c-800a-e551181e5781" path="/var/lib/kubelet/pods/fedf8a1e-7645-4e0c-800a-e551181e5781/volumes" Dec 01 09:37:53 crc kubenswrapper[4867]: I1201 09:37:53.027054 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h82jj"] Dec 01 09:37:53 crc kubenswrapper[4867]: I1201 09:37:53.034519 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d209-account-create-update-b55jg"] Dec 01 09:37:53 crc kubenswrapper[4867]: I1201 09:37:53.041955 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bf6f-account-create-update-87lj2"] Dec 01 09:37:53 crc kubenswrapper[4867]: I1201 09:37:53.049048 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d209-account-create-update-b55jg"] Dec 01 09:37:53 crc kubenswrapper[4867]: I1201 09:37:53.057632 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h82jj"] Dec 01 09:37:53 crc kubenswrapper[4867]: I1201 09:37:53.064507 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bf6f-account-create-update-87lj2"] Dec 01 09:37:54 crc kubenswrapper[4867]: I1201 09:37:54.030906 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-s59tq"] Dec 01 09:37:54 crc kubenswrapper[4867]: I1201 09:37:54.038598 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-92d3-account-create-update-4ttg6"] Dec 01 09:37:54 crc kubenswrapper[4867]: I1201 09:37:54.047470 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-92d3-account-create-update-4ttg6"] Dec 01 09:37:54 crc kubenswrapper[4867]: I1201 09:37:54.054425 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-s59tq"] Dec 01 09:37:54 crc kubenswrapper[4867]: I1201 09:37:54.839272 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2a97c7f8-1f59-4d4b-b689-7e5de839d1b9" path="/var/lib/kubelet/pods/2a97c7f8-1f59-4d4b-b689-7e5de839d1b9/volumes" Dec 01 09:37:54 crc kubenswrapper[4867]: I1201 09:37:54.841523 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fe79080-dd3d-45d8-9929-030bb4eb72c3" path="/var/lib/kubelet/pods/8fe79080-dd3d-45d8-9929-030bb4eb72c3/volumes" Dec 01 09:37:54 crc kubenswrapper[4867]: I1201 09:37:54.842218 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90966ae5-9855-4e64-bebc-fc216f56de50" path="/var/lib/kubelet/pods/90966ae5-9855-4e64-bebc-fc216f56de50/volumes" Dec 01 09:37:54 crc kubenswrapper[4867]: I1201 09:37:54.843368 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbd0b6f8-d458-42e6-a07a-ba22d371037d" path="/var/lib/kubelet/pods/bbd0b6f8-d458-42e6-a07a-ba22d371037d/volumes" Dec 01 09:37:54 crc kubenswrapper[4867]: I1201 09:37:54.843982 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda04dfd-8163-458a-baa4-df9622a4f5c6" path="/var/lib/kubelet/pods/fda04dfd-8163-458a-baa4-df9622a4f5c6/volumes" Dec 01 09:37:59 crc kubenswrapper[4867]: I1201 09:37:59.826922 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:37:59 crc kubenswrapper[4867]: E1201 09:37:59.827838 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:38:01 crc kubenswrapper[4867]: I1201 09:38:01.052997 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2vbhv"] Dec 01 09:38:01 crc kubenswrapper[4867]: I1201 
09:38:01.060761 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2vbhv"] Dec 01 09:38:02 crc kubenswrapper[4867]: I1201 09:38:02.838964 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39cdb2ad-9d97-4f37-90e4-a41f554c8755" path="/var/lib/kubelet/pods/39cdb2ad-9d97-4f37-90e4-a41f554c8755/volumes" Dec 01 09:38:12 crc kubenswrapper[4867]: I1201 09:38:12.827253 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:38:12 crc kubenswrapper[4867]: E1201 09:38:12.828063 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:38:14 crc kubenswrapper[4867]: I1201 09:38:14.160829 4867 generic.go:334] "Generic (PLEG): container finished" podID="a32a973f-6473-444b-a71a-d848773d8de2" containerID="5203394fff396f35fb28c482a17a0cc0391c06ec5d09b4bcb5343d5a29b7f39a" exitCode=0 Dec 01 09:38:14 crc kubenswrapper[4867]: I1201 09:38:14.160892 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" event={"ID":"a32a973f-6473-444b-a71a-d848773d8de2","Type":"ContainerDied","Data":"5203394fff396f35fb28c482a17a0cc0391c06ec5d09b4bcb5343d5a29b7f39a"} Dec 01 09:38:15 crc kubenswrapper[4867]: I1201 09:38:15.762759 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:38:15 crc kubenswrapper[4867]: I1201 09:38:15.925419 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkfbj\" (UniqueName: \"kubernetes.io/projected/a32a973f-6473-444b-a71a-d848773d8de2-kube-api-access-jkfbj\") pod \"a32a973f-6473-444b-a71a-d848773d8de2\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " Dec 01 09:38:15 crc kubenswrapper[4867]: I1201 09:38:15.925528 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-inventory\") pod \"a32a973f-6473-444b-a71a-d848773d8de2\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " Dec 01 09:38:15 crc kubenswrapper[4867]: I1201 09:38:15.925699 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-ssh-key\") pod \"a32a973f-6473-444b-a71a-d848773d8de2\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " Dec 01 09:38:15 crc kubenswrapper[4867]: I1201 09:38:15.925828 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-bootstrap-combined-ca-bundle\") pod \"a32a973f-6473-444b-a71a-d848773d8de2\" (UID: \"a32a973f-6473-444b-a71a-d848773d8de2\") " Dec 01 09:38:15 crc kubenswrapper[4867]: I1201 09:38:15.933156 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a32a973f-6473-444b-a71a-d848773d8de2" (UID: "a32a973f-6473-444b-a71a-d848773d8de2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:38:15 crc kubenswrapper[4867]: I1201 09:38:15.933488 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a32a973f-6473-444b-a71a-d848773d8de2-kube-api-access-jkfbj" (OuterVolumeSpecName: "kube-api-access-jkfbj") pod "a32a973f-6473-444b-a71a-d848773d8de2" (UID: "a32a973f-6473-444b-a71a-d848773d8de2"). InnerVolumeSpecName "kube-api-access-jkfbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:38:15 crc kubenswrapper[4867]: I1201 09:38:15.959155 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-inventory" (OuterVolumeSpecName: "inventory") pod "a32a973f-6473-444b-a71a-d848773d8de2" (UID: "a32a973f-6473-444b-a71a-d848773d8de2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:38:15 crc kubenswrapper[4867]: I1201 09:38:15.971725 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a32a973f-6473-444b-a71a-d848773d8de2" (UID: "a32a973f-6473-444b-a71a-d848773d8de2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.027797 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkfbj\" (UniqueName: \"kubernetes.io/projected/a32a973f-6473-444b-a71a-d848773d8de2-kube-api-access-jkfbj\") on node \"crc\" DevicePath \"\"" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.028073 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.028083 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.028095 4867 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a32a973f-6473-444b-a71a-d848773d8de2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.177569 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" event={"ID":"a32a973f-6473-444b-a71a-d848773d8de2","Type":"ContainerDied","Data":"f906d35c0dbc29840a7396d102338fb0a7257f09bd107f588cb601cd08268fe6"} Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.177612 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f906d35c0dbc29840a7396d102338fb0a7257f09bd107f588cb601cd08268fe6" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.177660 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.260503 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws"] Dec 01 09:38:16 crc kubenswrapper[4867]: E1201 09:38:16.261028 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerName="extract-utilities" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.261049 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerName="extract-utilities" Dec 01 09:38:16 crc kubenswrapper[4867]: E1201 09:38:16.261061 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerName="extract-content" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.261070 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerName="extract-content" Dec 01 09:38:16 crc kubenswrapper[4867]: E1201 09:38:16.261101 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerName="registry-server" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.261108 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerName="registry-server" Dec 01 09:38:16 crc kubenswrapper[4867]: E1201 09:38:16.261124 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32a973f-6473-444b-a71a-d848773d8de2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.261134 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32a973f-6473-444b-a71a-d848773d8de2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.261375 
4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="de64ea35-2fcf-4474-b7e9-891529a07a00" containerName="registry-server" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.261390 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a32a973f-6473-444b-a71a-d848773d8de2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.262222 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.273073 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.273099 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.273354 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.273696 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.277447 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws"] Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.334972 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8gzt\" (UniqueName: \"kubernetes.io/projected/eb0d277f-4c89-46d6-8e05-e72c291e30cc-kube-api-access-w8gzt\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc 
kubenswrapper[4867]: I1201 09:38:16.335106 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.335167 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.437291 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.437466 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.437618 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8gzt\" (UniqueName: 
\"kubernetes.io/projected/eb0d277f-4c89-46d6-8e05-e72c291e30cc-kube-api-access-w8gzt\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.441126 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.441473 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.462797 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8gzt\" (UniqueName: \"kubernetes.io/projected/eb0d277f-4c89-46d6-8e05-e72c291e30cc-kube-api-access-w8gzt\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:16 crc kubenswrapper[4867]: I1201 09:38:16.596011 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:38:17 crc kubenswrapper[4867]: I1201 09:38:17.193337 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws"] Dec 01 09:38:17 crc kubenswrapper[4867]: I1201 09:38:17.199676 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:38:18 crc kubenswrapper[4867]: I1201 09:38:18.233301 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" event={"ID":"eb0d277f-4c89-46d6-8e05-e72c291e30cc","Type":"ContainerStarted","Data":"76dfb0c117f8f13e83c3ff152df86649200b254ce934fb2f52ea37aeaf253c4f"} Dec 01 09:38:18 crc kubenswrapper[4867]: I1201 09:38:18.234631 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" event={"ID":"eb0d277f-4c89-46d6-8e05-e72c291e30cc","Type":"ContainerStarted","Data":"43e040277db6af09e3ca0ada3ab14983f9eb45ee609cd373dd9ec2daa868e91f"} Dec 01 09:38:18 crc kubenswrapper[4867]: I1201 09:38:18.251952 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" podStartSLOduration=2.070929223 podStartE2EDuration="2.251928674s" podCreationTimestamp="2025-12-01 09:38:16 +0000 UTC" firstStartedPulling="2025-12-01 09:38:17.199447014 +0000 UTC m=+1818.658833768" lastFinishedPulling="2025-12-01 09:38:17.380446465 +0000 UTC m=+1818.839833219" observedRunningTime="2025-12-01 09:38:18.247564263 +0000 UTC m=+1819.706951017" watchObservedRunningTime="2025-12-01 09:38:18.251928674 +0000 UTC m=+1819.711315438" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.610789 4867 scope.go:117] "RemoveContainer" containerID="9ad51f9fd9a76b6958eab2b67b127be3f2bf3548e3ddf9051911fa9aafdcb204" Dec 01 09:38:19 crc 
kubenswrapper[4867]: I1201 09:38:19.631237 4867 scope.go:117] "RemoveContainer" containerID="b90db1a3d03ded2af337174bb31955a2d8ef7554ab85f2979bb089eecae73e82" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.654077 4867 scope.go:117] "RemoveContainer" containerID="6bc8c6f1329022f1c5a544e78e2955000f239bcac76f4222378fe9d25b355391" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.725337 4867 scope.go:117] "RemoveContainer" containerID="0c5f1fce49b7d235eb2c663c45e8502c1355e005f213767a98e8fb6b6fb551dd" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.757173 4867 scope.go:117] "RemoveContainer" containerID="f0bc2379a1b2bab52c22ad998eeca13270df75d0a8c16184cb6d038832e7e045" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.799995 4867 scope.go:117] "RemoveContainer" containerID="84915af6eafda3e2a49022b05dff200a1e8c6a021bcbf78d3b23d28bf426c760" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.825926 4867 scope.go:117] "RemoveContainer" containerID="3fae2292faff5ca2946cc92ca9bc42d45b109d75c6dd47e4adaedb1369efaf7b" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.852381 4867 scope.go:117] "RemoveContainer" containerID="94ef95f022027bc8b0d1c2914b140376701eae047cbfc7d292d4d8d23ffc4c76" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.891882 4867 scope.go:117] "RemoveContainer" containerID="09424ec7d4f890da08bd6c027064bea9c0b869c3719407b0535305d6868ae74f" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.916697 4867 scope.go:117] "RemoveContainer" containerID="8541da6d714fb4be5f095bda500a278e17d2bb50ae3359126b8837171fb7851b" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.966484 4867 scope.go:117] "RemoveContainer" containerID="bc4c053fa3322961c6a417677e4fa3e7640e231458021436ad7bfc1802013052" Dec 01 09:38:19 crc kubenswrapper[4867]: I1201 09:38:19.986663 4867 scope.go:117] "RemoveContainer" containerID="4a74d24d68d5d840b8cb2237dbb0e3b14d9428896889eef774977cac0fd7e82c" Dec 01 09:38:20 crc kubenswrapper[4867]: I1201 
09:38:20.009007 4867 scope.go:117] "RemoveContainer" containerID="556309ab756db3ad47a82037ddad0b526638f0aa86226cd781b2196ad0bfc152" Dec 01 09:38:20 crc kubenswrapper[4867]: I1201 09:38:20.028417 4867 scope.go:117] "RemoveContainer" containerID="29a54cec9fb8cfd42529134bc3244a9cd16da33451aba1da17eabdb5d5387c87" Dec 01 09:38:20 crc kubenswrapper[4867]: I1201 09:38:20.056090 4867 scope.go:117] "RemoveContainer" containerID="14a3f72edb9ee9270095c0c2c342697f4e91a64667ae6a96d0cdf7f28921ec3a" Dec 01 09:38:20 crc kubenswrapper[4867]: I1201 09:38:20.087149 4867 scope.go:117] "RemoveContainer" containerID="f3ef8a0c20261aa0b9cd0cfc5dd04d9f041ba12261facb14043e30a41c0a4788" Dec 01 09:38:24 crc kubenswrapper[4867]: I1201 09:38:24.827845 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:38:24 crc kubenswrapper[4867]: E1201 09:38:24.829572 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:38:39 crc kubenswrapper[4867]: I1201 09:38:39.827909 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:38:39 crc kubenswrapper[4867]: E1201 09:38:39.828539 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" 
podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:38:50 crc kubenswrapper[4867]: I1201 09:38:50.042478 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-s52lq"] Dec 01 09:38:50 crc kubenswrapper[4867]: I1201 09:38:50.052779 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-s52lq"] Dec 01 09:38:50 crc kubenswrapper[4867]: I1201 09:38:50.841920 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0894df6-2174-4d0a-9e26-93650fd0e925" path="/var/lib/kubelet/pods/b0894df6-2174-4d0a-9e26-93650fd0e925/volumes" Dec 01 09:38:52 crc kubenswrapper[4867]: I1201 09:38:52.827311 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:38:53 crc kubenswrapper[4867]: I1201 09:38:53.031052 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-phzxd"] Dec 01 09:38:53 crc kubenswrapper[4867]: I1201 09:38:53.040993 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-phzxd"] Dec 01 09:38:53 crc kubenswrapper[4867]: I1201 09:38:53.572424 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"bd7e219b0f0fd8d4af12e79f8e7905ebebc9a75776aa03953e5ecb9bd9c9bd42"} Dec 01 09:38:54 crc kubenswrapper[4867]: I1201 09:38:54.839343 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb0c725b-663d-4764-b156-9426923ce046" path="/var/lib/kubelet/pods/fb0c725b-663d-4764-b156-9426923ce046/volumes" Dec 01 09:39:03 crc kubenswrapper[4867]: I1201 09:39:03.048893 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dq766"] Dec 01 09:39:03 crc kubenswrapper[4867]: I1201 09:39:03.080189 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-dq766"] Dec 01 09:39:04 crc kubenswrapper[4867]: I1201 09:39:04.859556 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdcd8107-dd0c-494b-b6ee-93fc8f3d6933" path="/var/lib/kubelet/pods/bdcd8107-dd0c-494b-b6ee-93fc8f3d6933/volumes" Dec 01 09:39:15 crc kubenswrapper[4867]: I1201 09:39:15.037708 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-56qbp"] Dec 01 09:39:15 crc kubenswrapper[4867]: I1201 09:39:15.048047 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-56qbp"] Dec 01 09:39:16 crc kubenswrapper[4867]: I1201 09:39:16.837205 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a891c34b-01dc-4e65-ad1d-b21597555988" path="/var/lib/kubelet/pods/a891c34b-01dc-4e65-ad1d-b21597555988/volumes" Dec 01 09:39:18 crc kubenswrapper[4867]: I1201 09:39:18.025958 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xmtc6"] Dec 01 09:39:18 crc kubenswrapper[4867]: I1201 09:39:18.035529 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xmtc6"] Dec 01 09:39:18 crc kubenswrapper[4867]: I1201 09:39:18.841772 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b95ca9-4891-4e69-a789-a21549f94247" path="/var/lib/kubelet/pods/65b95ca9-4891-4e69-a789-a21549f94247/volumes" Dec 01 09:39:20 crc kubenswrapper[4867]: I1201 09:39:20.374506 4867 scope.go:117] "RemoveContainer" containerID="fc37c5defc43343438e3ca09a00cfa564140074fed6fe82681cda085e1796b7f" Dec 01 09:39:20 crc kubenswrapper[4867]: I1201 09:39:20.405548 4867 scope.go:117] "RemoveContainer" containerID="b3aafaeb547edff8fcedb18f00fdc75d9dcd19277509071c41c71c261000a533" Dec 01 09:39:20 crc kubenswrapper[4867]: I1201 09:39:20.450874 4867 scope.go:117] "RemoveContainer" containerID="e4f0fd8d89b9a3460463a367a8cb01bf7cfe3e0464b1931cd24446d8df94d90b" Dec 01 09:39:20 crc kubenswrapper[4867]: 
I1201 09:39:20.517747 4867 scope.go:117] "RemoveContainer" containerID="3dde5d99cc2da2d38dca5a3796b9183ab71d8574bd8d931f6f284fcbd5788fac" Dec 01 09:39:20 crc kubenswrapper[4867]: I1201 09:39:20.554084 4867 scope.go:117] "RemoveContainer" containerID="738e1ba32f0935e93690fe3ae7b01f58519ac3f31f26a9814957c63353df460f" Dec 01 09:40:06 crc kubenswrapper[4867]: I1201 09:40:06.044625 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0fcd-account-create-update-pkxwm"] Dec 01 09:40:06 crc kubenswrapper[4867]: I1201 09:40:06.057698 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-kxgsc"] Dec 01 09:40:06 crc kubenswrapper[4867]: I1201 09:40:06.068959 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-qm4ft"] Dec 01 09:40:06 crc kubenswrapper[4867]: I1201 09:40:06.085439 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-kxgsc"] Dec 01 09:40:06 crc kubenswrapper[4867]: I1201 09:40:06.093737 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0fcd-account-create-update-pkxwm"] Dec 01 09:40:06 crc kubenswrapper[4867]: I1201 09:40:06.101685 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-qm4ft"] Dec 01 09:40:06 crc kubenswrapper[4867]: I1201 09:40:06.837991 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37bc51bf-0822-420e-8d4a-5cb236dd83e4" path="/var/lib/kubelet/pods/37bc51bf-0822-420e-8d4a-5cb236dd83e4/volumes" Dec 01 09:40:06 crc kubenswrapper[4867]: I1201 09:40:06.838607 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ea5fe34-452d-4805-8bda-47d8f1ab2381" path="/var/lib/kubelet/pods/5ea5fe34-452d-4805-8bda-47d8f1ab2381/volumes" Dec 01 09:40:06 crc kubenswrapper[4867]: I1201 09:40:06.839192 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf1dfe6-d703-43e9-9aca-436d6b37c2e9" 
path="/var/lib/kubelet/pods/adf1dfe6-d703-43e9-9aca-436d6b37c2e9/volumes" Dec 01 09:40:07 crc kubenswrapper[4867]: I1201 09:40:07.035185 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8d9d-account-create-update-q8sjd"] Dec 01 09:40:07 crc kubenswrapper[4867]: I1201 09:40:07.046166 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8d9d-account-create-update-q8sjd"] Dec 01 09:40:08 crc kubenswrapper[4867]: I1201 09:40:08.042483 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-kwrll"] Dec 01 09:40:08 crc kubenswrapper[4867]: I1201 09:40:08.058608 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-kwrll"] Dec 01 09:40:08 crc kubenswrapper[4867]: I1201 09:40:08.838585 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89cd7bb8-e306-4d6d-96a9-33e00ed9f194" path="/var/lib/kubelet/pods/89cd7bb8-e306-4d6d-96a9-33e00ed9f194/volumes" Dec 01 09:40:08 crc kubenswrapper[4867]: I1201 09:40:08.839575 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8f85ac5-7d71-4b5f-ae85-67b245b18c18" path="/var/lib/kubelet/pods/a8f85ac5-7d71-4b5f-ae85-67b245b18c18/volumes" Dec 01 09:40:09 crc kubenswrapper[4867]: I1201 09:40:09.026223 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c564-account-create-update-h6vqd"] Dec 01 09:40:09 crc kubenswrapper[4867]: I1201 09:40:09.035763 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c564-account-create-update-h6vqd"] Dec 01 09:40:10 crc kubenswrapper[4867]: I1201 09:40:10.837372 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e5461b6-70c3-4b0a-aea5-827baa9fc665" path="/var/lib/kubelet/pods/3e5461b6-70c3-4b0a-aea5-827baa9fc665/volumes" Dec 01 09:40:20 crc kubenswrapper[4867]: I1201 09:40:20.692400 4867 scope.go:117] "RemoveContainer" 
containerID="8380c6624ae4ca667b01d7fcfec3f1549121f09e65da55860522386cd182e428" Dec 01 09:40:20 crc kubenswrapper[4867]: I1201 09:40:20.729453 4867 scope.go:117] "RemoveContainer" containerID="e30b2dd307779de6d0c69d5daa297da446630ce11e847904b81d1e9857a6e808" Dec 01 09:40:20 crc kubenswrapper[4867]: I1201 09:40:20.778583 4867 scope.go:117] "RemoveContainer" containerID="e34023d4caf0c4f8960e4d119265440dcb0a7191944d841d1b3c3cf1054cf5b6" Dec 01 09:40:20 crc kubenswrapper[4867]: I1201 09:40:20.820570 4867 scope.go:117] "RemoveContainer" containerID="b44a345ee62e34ad61416dcfff0a0fa9014134312601d92b46bc5a5faa525f2f" Dec 01 09:40:20 crc kubenswrapper[4867]: I1201 09:40:20.873383 4867 scope.go:117] "RemoveContainer" containerID="17574e05985b2b4766ede465929a62c0017d755e96dc17eb9460add4f9de65cf" Dec 01 09:40:20 crc kubenswrapper[4867]: I1201 09:40:20.906775 4867 scope.go:117] "RemoveContainer" containerID="c4b319bdd9890133e51e8a855985e834ffdc22314a0bfd4357c58ac42b61dd7f" Dec 01 09:40:26 crc kubenswrapper[4867]: I1201 09:40:26.390996 4867 generic.go:334] "Generic (PLEG): container finished" podID="eb0d277f-4c89-46d6-8e05-e72c291e30cc" containerID="76dfb0c117f8f13e83c3ff152df86649200b254ce934fb2f52ea37aeaf253c4f" exitCode=0 Dec 01 09:40:26 crc kubenswrapper[4867]: I1201 09:40:26.391067 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" event={"ID":"eb0d277f-4c89-46d6-8e05-e72c291e30cc","Type":"ContainerDied","Data":"76dfb0c117f8f13e83c3ff152df86649200b254ce934fb2f52ea37aeaf253c4f"} Dec 01 09:40:27 crc kubenswrapper[4867]: I1201 09:40:27.837638 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:40:27 crc kubenswrapper[4867]: I1201 09:40:27.902240 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-ssh-key\") pod \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " Dec 01 09:40:27 crc kubenswrapper[4867]: I1201 09:40:27.902627 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8gzt\" (UniqueName: \"kubernetes.io/projected/eb0d277f-4c89-46d6-8e05-e72c291e30cc-kube-api-access-w8gzt\") pod \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " Dec 01 09:40:27 crc kubenswrapper[4867]: I1201 09:40:27.902791 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-inventory\") pod \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\" (UID: \"eb0d277f-4c89-46d6-8e05-e72c291e30cc\") " Dec 01 09:40:27 crc kubenswrapper[4867]: I1201 09:40:27.924699 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb0d277f-4c89-46d6-8e05-e72c291e30cc-kube-api-access-w8gzt" (OuterVolumeSpecName: "kube-api-access-w8gzt") pod "eb0d277f-4c89-46d6-8e05-e72c291e30cc" (UID: "eb0d277f-4c89-46d6-8e05-e72c291e30cc"). InnerVolumeSpecName "kube-api-access-w8gzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:40:27 crc kubenswrapper[4867]: I1201 09:40:27.940621 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eb0d277f-4c89-46d6-8e05-e72c291e30cc" (UID: "eb0d277f-4c89-46d6-8e05-e72c291e30cc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:40:27 crc kubenswrapper[4867]: I1201 09:40:27.941247 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-inventory" (OuterVolumeSpecName: "inventory") pod "eb0d277f-4c89-46d6-8e05-e72c291e30cc" (UID: "eb0d277f-4c89-46d6-8e05-e72c291e30cc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.006965 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8gzt\" (UniqueName: \"kubernetes.io/projected/eb0d277f-4c89-46d6-8e05-e72c291e30cc-kube-api-access-w8gzt\") on node \"crc\" DevicePath \"\"" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.007032 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.007044 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb0d277f-4c89-46d6-8e05-e72c291e30cc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.408892 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" event={"ID":"eb0d277f-4c89-46d6-8e05-e72c291e30cc","Type":"ContainerDied","Data":"43e040277db6af09e3ca0ada3ab14983f9eb45ee609cd373dd9ec2daa868e91f"} Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.408929 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43e040277db6af09e3ca0ada3ab14983f9eb45ee609cd373dd9ec2daa868e91f" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.408970 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.495114 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb"] Dec 01 09:40:28 crc kubenswrapper[4867]: E1201 09:40:28.495882 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0d277f-4c89-46d6-8e05-e72c291e30cc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.495905 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0d277f-4c89-46d6-8e05-e72c291e30cc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.496127 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0d277f-4c89-46d6-8e05-e72c291e30cc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.496845 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.499047 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.499078 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.499049 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.501500 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.511257 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb"] Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.617235 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44ppb\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.617289 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvr6m\" (UniqueName: \"kubernetes.io/projected/d598d0dc-37e0-47ac-8fcd-597f70f1300b-kube-api-access-tvr6m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44ppb\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 
09:40:28.617357 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44ppb\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.718694 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44ppb\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.718753 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvr6m\" (UniqueName: \"kubernetes.io/projected/d598d0dc-37e0-47ac-8fcd-597f70f1300b-kube-api-access-tvr6m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44ppb\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.718840 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44ppb\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.726482 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-44ppb\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.728242 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44ppb\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.745606 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvr6m\" (UniqueName: \"kubernetes.io/projected/d598d0dc-37e0-47ac-8fcd-597f70f1300b-kube-api-access-tvr6m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-44ppb\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:28 crc kubenswrapper[4867]: I1201 09:40:28.835864 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:40:29 crc kubenswrapper[4867]: I1201 09:40:29.368722 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb"] Dec 01 09:40:29 crc kubenswrapper[4867]: I1201 09:40:29.418648 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" event={"ID":"d598d0dc-37e0-47ac-8fcd-597f70f1300b","Type":"ContainerStarted","Data":"4b07cd54887335ac2773d32f7f41cd185388688c312566dcce09e07d54cebbc7"} Dec 01 09:40:30 crc kubenswrapper[4867]: I1201 09:40:30.427933 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" event={"ID":"d598d0dc-37e0-47ac-8fcd-597f70f1300b","Type":"ContainerStarted","Data":"d3f3b4a1912dfe03768a2cb8e3500cd350e3ffe2e87efa2bd2dff123c9ff81b3"} Dec 01 09:40:30 crc kubenswrapper[4867]: I1201 09:40:30.460650 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" podStartSLOduration=2.2836193160000002 podStartE2EDuration="2.460626111s" podCreationTimestamp="2025-12-01 09:40:28 +0000 UTC" firstStartedPulling="2025-12-01 09:40:29.377095948 +0000 UTC m=+1950.836482702" lastFinishedPulling="2025-12-01 09:40:29.554102743 +0000 UTC m=+1951.013489497" observedRunningTime="2025-12-01 09:40:30.444433155 +0000 UTC m=+1951.903819909" watchObservedRunningTime="2025-12-01 09:40:30.460626111 +0000 UTC m=+1951.920012885" Dec 01 09:41:09 crc kubenswrapper[4867]: I1201 09:41:09.046489 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdt5g"] Dec 01 09:41:09 crc kubenswrapper[4867]: I1201 09:41:09.057953 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdt5g"] Dec 01 
09:41:10 crc kubenswrapper[4867]: I1201 09:41:10.837667 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83471d7-4d9d-427c-b769-bd072acbaae0" path="/var/lib/kubelet/pods/e83471d7-4d9d-427c-b769-bd072acbaae0/volumes" Dec 01 09:41:21 crc kubenswrapper[4867]: I1201 09:41:21.051780 4867 scope.go:117] "RemoveContainer" containerID="e2f8e9d387cdafacb10858ccb0a575eab5feec4f6cad38bf28df16f55c143b64" Dec 01 09:41:21 crc kubenswrapper[4867]: I1201 09:41:21.601404 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:41:21 crc kubenswrapper[4867]: I1201 09:41:21.601456 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:41:39 crc kubenswrapper[4867]: I1201 09:41:39.058531 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-md2jq"] Dec 01 09:41:39 crc kubenswrapper[4867]: I1201 09:41:39.068336 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-md2jq"] Dec 01 09:41:40 crc kubenswrapper[4867]: I1201 09:41:40.033673 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-zvrlk"] Dec 01 09:41:40 crc kubenswrapper[4867]: I1201 09:41:40.042876 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-zvrlk"] Dec 01 09:41:40 crc kubenswrapper[4867]: I1201 09:41:40.838859 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="aba41ac2-513c-437d-a26c-7b341306bddc" path="/var/lib/kubelet/pods/aba41ac2-513c-437d-a26c-7b341306bddc/volumes" Dec 01 09:41:40 crc kubenswrapper[4867]: I1201 09:41:40.839618 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4cb4881-0b4e-4085-9627-1efc85a5efaa" path="/var/lib/kubelet/pods/d4cb4881-0b4e-4085-9627-1efc85a5efaa/volumes" Dec 01 09:41:51 crc kubenswrapper[4867]: I1201 09:41:51.601893 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:41:51 crc kubenswrapper[4867]: I1201 09:41:51.602359 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:42:21 crc kubenswrapper[4867]: I1201 09:42:21.124463 4867 scope.go:117] "RemoveContainer" containerID="41246256b811696ce11c99c101cb562ee0d889974e71ac8343234377721d4eb6" Dec 01 09:42:21 crc kubenswrapper[4867]: I1201 09:42:21.155419 4867 scope.go:117] "RemoveContainer" containerID="caae03d03e7fecae1a95ba8eb1c57e96e353740e93fb825166e570ab1a4d60b8" Dec 01 09:42:21 crc kubenswrapper[4867]: I1201 09:42:21.601427 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:42:21 crc kubenswrapper[4867]: I1201 09:42:21.601475 4867 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:42:21 crc kubenswrapper[4867]: I1201 09:42:21.601516 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:42:21 crc kubenswrapper[4867]: I1201 09:42:21.602190 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd7e219b0f0fd8d4af12e79f8e7905ebebc9a75776aa03953e5ecb9bd9c9bd42"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:42:21 crc kubenswrapper[4867]: I1201 09:42:21.602240 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://bd7e219b0f0fd8d4af12e79f8e7905ebebc9a75776aa03953e5ecb9bd9c9bd42" gracePeriod=600 Dec 01 09:42:22 crc kubenswrapper[4867]: I1201 09:42:22.448120 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="bd7e219b0f0fd8d4af12e79f8e7905ebebc9a75776aa03953e5ecb9bd9c9bd42" exitCode=0 Dec 01 09:42:22 crc kubenswrapper[4867]: I1201 09:42:22.448194 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"bd7e219b0f0fd8d4af12e79f8e7905ebebc9a75776aa03953e5ecb9bd9c9bd42"} Dec 01 09:42:22 crc kubenswrapper[4867]: I1201 09:42:22.448436 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb"} Dec 01 09:42:22 crc kubenswrapper[4867]: I1201 09:42:22.448459 4867 scope.go:117] "RemoveContainer" containerID="5d975c93b55e3a68e0c4cd9682ffba6baed12435e03710f6bc18d2ad99327949" Dec 01 09:42:23 crc kubenswrapper[4867]: I1201 09:42:23.063291 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vkcwz"] Dec 01 09:42:23 crc kubenswrapper[4867]: I1201 09:42:23.071100 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vkcwz"] Dec 01 09:42:24 crc kubenswrapper[4867]: I1201 09:42:24.838513 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f49a1bf-d1bd-4027-879c-e5d8b6081396" path="/var/lib/kubelet/pods/2f49a1bf-d1bd-4027-879c-e5d8b6081396/volumes" Dec 01 09:42:26 crc kubenswrapper[4867]: I1201 09:42:26.488675 4867 generic.go:334] "Generic (PLEG): container finished" podID="d598d0dc-37e0-47ac-8fcd-597f70f1300b" containerID="d3f3b4a1912dfe03768a2cb8e3500cd350e3ffe2e87efa2bd2dff123c9ff81b3" exitCode=0 Dec 01 09:42:26 crc kubenswrapper[4867]: I1201 09:42:26.488761 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" event={"ID":"d598d0dc-37e0-47ac-8fcd-597f70f1300b","Type":"ContainerDied","Data":"d3f3b4a1912dfe03768a2cb8e3500cd350e3ffe2e87efa2bd2dff123c9ff81b3"} Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.003358 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.056163 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-inventory\") pod \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.056479 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvr6m\" (UniqueName: \"kubernetes.io/projected/d598d0dc-37e0-47ac-8fcd-597f70f1300b-kube-api-access-tvr6m\") pod \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.056559 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-ssh-key\") pod \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\" (UID: \"d598d0dc-37e0-47ac-8fcd-597f70f1300b\") " Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.076134 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d598d0dc-37e0-47ac-8fcd-597f70f1300b-kube-api-access-tvr6m" (OuterVolumeSpecName: "kube-api-access-tvr6m") pod "d598d0dc-37e0-47ac-8fcd-597f70f1300b" (UID: "d598d0dc-37e0-47ac-8fcd-597f70f1300b"). InnerVolumeSpecName "kube-api-access-tvr6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.104664 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d598d0dc-37e0-47ac-8fcd-597f70f1300b" (UID: "d598d0dc-37e0-47ac-8fcd-597f70f1300b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.113184 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-inventory" (OuterVolumeSpecName: "inventory") pod "d598d0dc-37e0-47ac-8fcd-597f70f1300b" (UID: "d598d0dc-37e0-47ac-8fcd-597f70f1300b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.158298 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvr6m\" (UniqueName: \"kubernetes.io/projected/d598d0dc-37e0-47ac-8fcd-597f70f1300b-kube-api-access-tvr6m\") on node \"crc\" DevicePath \"\"" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.158335 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.158349 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d598d0dc-37e0-47ac-8fcd-597f70f1300b-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.523246 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" event={"ID":"d598d0dc-37e0-47ac-8fcd-597f70f1300b","Type":"ContainerDied","Data":"4b07cd54887335ac2773d32f7f41cd185388688c312566dcce09e07d54cebbc7"} Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.523664 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b07cd54887335ac2773d32f7f41cd185388688c312566dcce09e07d54cebbc7" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.523351 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-44ppb" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.616896 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8"] Dec 01 09:42:28 crc kubenswrapper[4867]: E1201 09:42:28.617491 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d598d0dc-37e0-47ac-8fcd-597f70f1300b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.617517 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d598d0dc-37e0-47ac-8fcd-597f70f1300b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.617717 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d598d0dc-37e0-47ac-8fcd-597f70f1300b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.618639 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.621040 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.621883 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.625570 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.630148 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.636527 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8"] Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.666313 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.666493 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.666697 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv6wv\" (UniqueName: \"kubernetes.io/projected/21f6bea0-2abe-4029-8272-f6da0825cf69-kube-api-access-pv6wv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.769098 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.769294 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv6wv\" (UniqueName: \"kubernetes.io/projected/21f6bea0-2abe-4029-8272-f6da0825cf69-kube-api-access-pv6wv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.769448 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.774667 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.774774 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.786544 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv6wv\" (UniqueName: \"kubernetes.io/projected/21f6bea0-2abe-4029-8272-f6da0825cf69-kube-api-access-pv6wv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:28 crc kubenswrapper[4867]: I1201 09:42:28.934589 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:29 crc kubenswrapper[4867]: I1201 09:42:29.653254 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8"] Dec 01 09:42:30 crc kubenswrapper[4867]: I1201 09:42:30.538418 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" event={"ID":"21f6bea0-2abe-4029-8272-f6da0825cf69","Type":"ContainerStarted","Data":"7ad3376d844beba276178bd3615b0ecb797af342a93f2e96b9ff5868cc089346"} Dec 01 09:42:30 crc kubenswrapper[4867]: I1201 09:42:30.538730 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" event={"ID":"21f6bea0-2abe-4029-8272-f6da0825cf69","Type":"ContainerStarted","Data":"41ecc6c8e44c2f1158601ce86bca96ba46ababca367969451d9ea94782fcd30b"} Dec 01 09:42:30 crc kubenswrapper[4867]: I1201 09:42:30.558347 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" podStartSLOduration=2.3714959159999998 podStartE2EDuration="2.558329831s" podCreationTimestamp="2025-12-01 09:42:28 +0000 UTC" firstStartedPulling="2025-12-01 09:42:29.672007009 +0000 UTC m=+2071.131393763" lastFinishedPulling="2025-12-01 09:42:29.858840924 +0000 UTC m=+2071.318227678" observedRunningTime="2025-12-01 09:42:30.552304306 +0000 UTC m=+2072.011691060" watchObservedRunningTime="2025-12-01 09:42:30.558329831 +0000 UTC m=+2072.017716585" Dec 01 09:42:35 crc kubenswrapper[4867]: I1201 09:42:35.578992 4867 generic.go:334] "Generic (PLEG): container finished" podID="21f6bea0-2abe-4029-8272-f6da0825cf69" containerID="7ad3376d844beba276178bd3615b0ecb797af342a93f2e96b9ff5868cc089346" exitCode=0 Dec 01 09:42:35 crc kubenswrapper[4867]: I1201 09:42:35.579073 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" event={"ID":"21f6bea0-2abe-4029-8272-f6da0825cf69","Type":"ContainerDied","Data":"7ad3376d844beba276178bd3615b0ecb797af342a93f2e96b9ff5868cc089346"} Dec 01 09:42:36 crc kubenswrapper[4867]: I1201 09:42:36.970339 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.023001 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-inventory\") pod \"21f6bea0-2abe-4029-8272-f6da0825cf69\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.024152 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-ssh-key\") pod \"21f6bea0-2abe-4029-8272-f6da0825cf69\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.024199 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv6wv\" (UniqueName: \"kubernetes.io/projected/21f6bea0-2abe-4029-8272-f6da0825cf69-kube-api-access-pv6wv\") pod \"21f6bea0-2abe-4029-8272-f6da0825cf69\" (UID: \"21f6bea0-2abe-4029-8272-f6da0825cf69\") " Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.032326 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f6bea0-2abe-4029-8272-f6da0825cf69-kube-api-access-pv6wv" (OuterVolumeSpecName: "kube-api-access-pv6wv") pod "21f6bea0-2abe-4029-8272-f6da0825cf69" (UID: "21f6bea0-2abe-4029-8272-f6da0825cf69"). InnerVolumeSpecName "kube-api-access-pv6wv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.053027 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "21f6bea0-2abe-4029-8272-f6da0825cf69" (UID: "21f6bea0-2abe-4029-8272-f6da0825cf69"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.061410 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-inventory" (OuterVolumeSpecName: "inventory") pod "21f6bea0-2abe-4029-8272-f6da0825cf69" (UID: "21f6bea0-2abe-4029-8272-f6da0825cf69"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.128536 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.128570 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv6wv\" (UniqueName: \"kubernetes.io/projected/21f6bea0-2abe-4029-8272-f6da0825cf69-kube-api-access-pv6wv\") on node \"crc\" DevicePath \"\"" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.128583 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21f6bea0-2abe-4029-8272-f6da0825cf69-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.596781 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" 
event={"ID":"21f6bea0-2abe-4029-8272-f6da0825cf69","Type":"ContainerDied","Data":"41ecc6c8e44c2f1158601ce86bca96ba46ababca367969451d9ea94782fcd30b"} Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.597148 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41ecc6c8e44c2f1158601ce86bca96ba46ababca367969451d9ea94782fcd30b" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.596874 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.666993 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88"] Dec 01 09:42:37 crc kubenswrapper[4867]: E1201 09:42:37.667568 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f6bea0-2abe-4029-8272-f6da0825cf69" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.667640 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f6bea0-2abe-4029-8272-f6da0825cf69" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.667909 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f6bea0-2abe-4029-8272-f6da0825cf69" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.668582 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.671096 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.671321 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.671560 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.671757 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.692878 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88"] Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.739612 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnljq\" (UniqueName: \"kubernetes.io/projected/93968ab3-45b8-4b7a-a395-8344714bb9e9-kube-api-access-xnljq\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ktd88\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.739697 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ktd88\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.739773 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ktd88\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.842966 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ktd88\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.843178 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnljq\" (UniqueName: \"kubernetes.io/projected/93968ab3-45b8-4b7a-a395-8344714bb9e9-kube-api-access-xnljq\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ktd88\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.843231 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ktd88\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.849645 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ktd88\" (UID: 
\"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.853301 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ktd88\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.860382 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnljq\" (UniqueName: \"kubernetes.io/projected/93968ab3-45b8-4b7a-a395-8344714bb9e9-kube-api-access-xnljq\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ktd88\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:37 crc kubenswrapper[4867]: I1201 09:42:37.989878 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:42:38 crc kubenswrapper[4867]: I1201 09:42:38.529598 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88"] Dec 01 09:42:38 crc kubenswrapper[4867]: I1201 09:42:38.605634 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" event={"ID":"93968ab3-45b8-4b7a-a395-8344714bb9e9","Type":"ContainerStarted","Data":"55bb0ac9f88b33a8126dd5e291860ef7ed7a0e0fc3d0541d237df2d0ec04f9e0"} Dec 01 09:42:39 crc kubenswrapper[4867]: I1201 09:42:39.614665 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" event={"ID":"93968ab3-45b8-4b7a-a395-8344714bb9e9","Type":"ContainerStarted","Data":"0fc346c88377b617dc225332606b2c738be277d2a69250f0086688bfe863ecc5"} Dec 01 09:42:39 crc kubenswrapper[4867]: I1201 09:42:39.635091 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" podStartSLOduration=2.429843453 podStartE2EDuration="2.635069813s" podCreationTimestamp="2025-12-01 09:42:37 +0000 UTC" firstStartedPulling="2025-12-01 09:42:38.536208879 +0000 UTC m=+2079.995595633" lastFinishedPulling="2025-12-01 09:42:38.741435219 +0000 UTC m=+2080.200821993" observedRunningTime="2025-12-01 09:42:39.631872165 +0000 UTC m=+2081.091258939" watchObservedRunningTime="2025-12-01 09:42:39.635069813 +0000 UTC m=+2081.094456567" Dec 01 09:43:21 crc kubenswrapper[4867]: I1201 09:43:21.259679 4867 scope.go:117] "RemoveContainer" containerID="81eac3897506718b344a440218a58d4ea149d62d48ae6d9a6b6ba5946dc88ee9" Dec 01 09:43:25 crc kubenswrapper[4867]: I1201 09:43:25.036410 4867 generic.go:334] "Generic (PLEG): container finished" podID="93968ab3-45b8-4b7a-a395-8344714bb9e9" 
containerID="0fc346c88377b617dc225332606b2c738be277d2a69250f0086688bfe863ecc5" exitCode=0 Dec 01 09:43:25 crc kubenswrapper[4867]: I1201 09:43:25.036540 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" event={"ID":"93968ab3-45b8-4b7a-a395-8344714bb9e9","Type":"ContainerDied","Data":"0fc346c88377b617dc225332606b2c738be277d2a69250f0086688bfe863ecc5"} Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.417571 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.602661 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-inventory\") pod \"93968ab3-45b8-4b7a-a395-8344714bb9e9\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.602919 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnljq\" (UniqueName: \"kubernetes.io/projected/93968ab3-45b8-4b7a-a395-8344714bb9e9-kube-api-access-xnljq\") pod \"93968ab3-45b8-4b7a-a395-8344714bb9e9\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.602948 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-ssh-key\") pod \"93968ab3-45b8-4b7a-a395-8344714bb9e9\" (UID: \"93968ab3-45b8-4b7a-a395-8344714bb9e9\") " Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.609301 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93968ab3-45b8-4b7a-a395-8344714bb9e9-kube-api-access-xnljq" (OuterVolumeSpecName: "kube-api-access-xnljq") pod 
"93968ab3-45b8-4b7a-a395-8344714bb9e9" (UID: "93968ab3-45b8-4b7a-a395-8344714bb9e9"). InnerVolumeSpecName "kube-api-access-xnljq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.643184 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-inventory" (OuterVolumeSpecName: "inventory") pod "93968ab3-45b8-4b7a-a395-8344714bb9e9" (UID: "93968ab3-45b8-4b7a-a395-8344714bb9e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.646896 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "93968ab3-45b8-4b7a-a395-8344714bb9e9" (UID: "93968ab3-45b8-4b7a-a395-8344714bb9e9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.705796 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnljq\" (UniqueName: \"kubernetes.io/projected/93968ab3-45b8-4b7a-a395-8344714bb9e9-kube-api-access-xnljq\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.705865 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:26 crc kubenswrapper[4867]: I1201 09:43:26.705875 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93968ab3-45b8-4b7a-a395-8344714bb9e9-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.056112 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" 
event={"ID":"93968ab3-45b8-4b7a-a395-8344714bb9e9","Type":"ContainerDied","Data":"55bb0ac9f88b33a8126dd5e291860ef7ed7a0e0fc3d0541d237df2d0ec04f9e0"} Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.056553 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55bb0ac9f88b33a8126dd5e291860ef7ed7a0e0fc3d0541d237df2d0ec04f9e0" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.056468 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ktd88" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.150674 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq"] Dec 01 09:43:27 crc kubenswrapper[4867]: E1201 09:43:27.151171 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93968ab3-45b8-4b7a-a395-8344714bb9e9" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.151198 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93968ab3-45b8-4b7a-a395-8344714bb9e9" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.151439 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93968ab3-45b8-4b7a-a395-8344714bb9e9" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.152263 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.155379 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.155527 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.155619 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.157382 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.162280 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq"] Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.316707 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d2px\" (UniqueName: \"kubernetes.io/projected/828b404a-aff1-4642-8893-d0ba513e520d-kube-api-access-8d2px\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.316881 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.316968 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.418360 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d2px\" (UniqueName: \"kubernetes.io/projected/828b404a-aff1-4642-8893-d0ba513e520d-kube-api-access-8d2px\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.419670 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.419885 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.437555 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq\" (UID: 
\"828b404a-aff1-4642-8893-d0ba513e520d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.437791 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.440888 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d2px\" (UniqueName: \"kubernetes.io/projected/828b404a-aff1-4642-8893-d0ba513e520d-kube-api-access-8d2px\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:27 crc kubenswrapper[4867]: I1201 09:43:27.473350 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:43:28 crc kubenswrapper[4867]: I1201 09:43:28.008063 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq"] Dec 01 09:43:28 crc kubenswrapper[4867]: W1201 09:43:28.027120 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod828b404a_aff1_4642_8893_d0ba513e520d.slice/crio-ad8b71a0300cfb6aaba3b41505fa5e6b353677920a5b3fac44170593cbf998c4 WatchSource:0}: Error finding container ad8b71a0300cfb6aaba3b41505fa5e6b353677920a5b3fac44170593cbf998c4: Status 404 returned error can't find the container with id ad8b71a0300cfb6aaba3b41505fa5e6b353677920a5b3fac44170593cbf998c4 Dec 01 09:43:28 crc kubenswrapper[4867]: I1201 09:43:28.031642 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:43:28 crc kubenswrapper[4867]: I1201 09:43:28.067470 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" event={"ID":"828b404a-aff1-4642-8893-d0ba513e520d","Type":"ContainerStarted","Data":"ad8b71a0300cfb6aaba3b41505fa5e6b353677920a5b3fac44170593cbf998c4"} Dec 01 09:43:29 crc kubenswrapper[4867]: I1201 09:43:29.085627 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" event={"ID":"828b404a-aff1-4642-8893-d0ba513e520d","Type":"ContainerStarted","Data":"ead83e40be6a6aa37227cd86bd0f2bf17f9dc2bd67fb318f4c872c8e624bc625"} Dec 01 09:43:29 crc kubenswrapper[4867]: I1201 09:43:29.110457 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" podStartSLOduration=1.926921399 podStartE2EDuration="2.110439493s" podCreationTimestamp="2025-12-01 09:43:27 +0000 UTC" 
firstStartedPulling="2025-12-01 09:43:28.031322262 +0000 UTC m=+2129.490709026" lastFinishedPulling="2025-12-01 09:43:28.214840366 +0000 UTC m=+2129.674227120" observedRunningTime="2025-12-01 09:43:29.10705569 +0000 UTC m=+2130.566442444" watchObservedRunningTime="2025-12-01 09:43:29.110439493 +0000 UTC m=+2130.569826247" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.450622 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q62lm"] Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.454436 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.476162 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q62lm"] Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.592379 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-utilities\") pod \"redhat-marketplace-q62lm\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.592442 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lndxp\" (UniqueName: \"kubernetes.io/projected/bb2700af-cf45-46d4-92e8-6dab22faa157-kube-api-access-lndxp\") pod \"redhat-marketplace-q62lm\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.592500 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-catalog-content\") pod \"redhat-marketplace-q62lm\" 
(UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.694520 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-catalog-content\") pod \"redhat-marketplace-q62lm\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.694707 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-utilities\") pod \"redhat-marketplace-q62lm\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.694762 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lndxp\" (UniqueName: \"kubernetes.io/projected/bb2700af-cf45-46d4-92e8-6dab22faa157-kube-api-access-lndxp\") pod \"redhat-marketplace-q62lm\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.695069 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-catalog-content\") pod \"redhat-marketplace-q62lm\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.695302 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-utilities\") pod \"redhat-marketplace-q62lm\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " 
pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.714650 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lndxp\" (UniqueName: \"kubernetes.io/projected/bb2700af-cf45-46d4-92e8-6dab22faa157-kube-api-access-lndxp\") pod \"redhat-marketplace-q62lm\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:57 crc kubenswrapper[4867]: I1201 09:43:57.782413 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:43:58 crc kubenswrapper[4867]: I1201 09:43:58.183244 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q62lm"] Dec 01 09:43:58 crc kubenswrapper[4867]: I1201 09:43:58.343006 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q62lm" event={"ID":"bb2700af-cf45-46d4-92e8-6dab22faa157","Type":"ContainerStarted","Data":"5dbc32fe285a5f65c9611f724002212580c09291a3a42ee023d33f1c81dcd062"} Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.451472 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kn57f"] Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.453631 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.475708 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kn57f"] Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.544835 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-catalog-content\") pod \"redhat-operators-kn57f\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.545043 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-utilities\") pod \"redhat-operators-kn57f\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.545230 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdkj4\" (UniqueName: \"kubernetes.io/projected/fa0aa429-09f2-49cf-8a90-e85a543511df-kube-api-access-cdkj4\") pod \"redhat-operators-kn57f\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.646966 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-utilities\") pod \"redhat-operators-kn57f\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.647053 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cdkj4\" (UniqueName: \"kubernetes.io/projected/fa0aa429-09f2-49cf-8a90-e85a543511df-kube-api-access-cdkj4\") pod \"redhat-operators-kn57f\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.647309 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-utilities\") pod \"redhat-operators-kn57f\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.647551 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-catalog-content\") pod \"redhat-operators-kn57f\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.647876 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-catalog-content\") pod \"redhat-operators-kn57f\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.665803 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdkj4\" (UniqueName: \"kubernetes.io/projected/fa0aa429-09f2-49cf-8a90-e85a543511df-kube-api-access-cdkj4\") pod \"redhat-operators-kn57f\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:00 crc kubenswrapper[4867]: I1201 09:44:00.798298 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.331716 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kn57f"] Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.381355 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kn57f" event={"ID":"fa0aa429-09f2-49cf-8a90-e85a543511df","Type":"ContainerStarted","Data":"439b1170bcc6f98fa425e64ec836d978fbbfa5a3fec57afa5b565a3e5b71f722"} Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.450178 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zz648"] Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.453044 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.461497 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zz648"] Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.570354 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvv6m\" (UniqueName: \"kubernetes.io/projected/11f59dbb-501a-4f55-9c9f-4a261a636d02-kube-api-access-fvv6m\") pod \"certified-operators-zz648\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.570774 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-catalog-content\") pod \"certified-operators-zz648\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 
09:44:01.570996 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-utilities\") pod \"certified-operators-zz648\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.672708 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvv6m\" (UniqueName: \"kubernetes.io/projected/11f59dbb-501a-4f55-9c9f-4a261a636d02-kube-api-access-fvv6m\") pod \"certified-operators-zz648\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.672840 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-catalog-content\") pod \"certified-operators-zz648\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.672903 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-utilities\") pod \"certified-operators-zz648\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.673404 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-utilities\") pod \"certified-operators-zz648\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.673931 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-catalog-content\") pod \"certified-operators-zz648\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.723037 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvv6m\" (UniqueName: \"kubernetes.io/projected/11f59dbb-501a-4f55-9c9f-4a261a636d02-kube-api-access-fvv6m\") pod \"certified-operators-zz648\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:01 crc kubenswrapper[4867]: I1201 09:44:01.768437 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:02 crc kubenswrapper[4867]: I1201 09:44:02.216030 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zz648"] Dec 01 09:44:02 crc kubenswrapper[4867]: W1201 09:44:02.221907 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11f59dbb_501a_4f55_9c9f_4a261a636d02.slice/crio-c5bbfa16b3c975522444ce2b3529aee9a21b32bb70325af70fed4100b58b4eb4 WatchSource:0}: Error finding container c5bbfa16b3c975522444ce2b3529aee9a21b32bb70325af70fed4100b58b4eb4: Status 404 returned error can't find the container with id c5bbfa16b3c975522444ce2b3529aee9a21b32bb70325af70fed4100b58b4eb4 Dec 01 09:44:02 crc kubenswrapper[4867]: I1201 09:44:02.392671 4867 generic.go:334] "Generic (PLEG): container finished" podID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerID="a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28" exitCode=0 Dec 01 09:44:02 crc kubenswrapper[4867]: I1201 09:44:02.392957 4867 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-q62lm" event={"ID":"bb2700af-cf45-46d4-92e8-6dab22faa157","Type":"ContainerDied","Data":"a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28"} Dec 01 09:44:02 crc kubenswrapper[4867]: I1201 09:44:02.396555 4867 generic.go:334] "Generic (PLEG): container finished" podID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerID="87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3" exitCode=0 Dec 01 09:44:02 crc kubenswrapper[4867]: I1201 09:44:02.396621 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kn57f" event={"ID":"fa0aa429-09f2-49cf-8a90-e85a543511df","Type":"ContainerDied","Data":"87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3"} Dec 01 09:44:02 crc kubenswrapper[4867]: I1201 09:44:02.398944 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zz648" event={"ID":"11f59dbb-501a-4f55-9c9f-4a261a636d02","Type":"ContainerStarted","Data":"c5bbfa16b3c975522444ce2b3529aee9a21b32bb70325af70fed4100b58b4eb4"} Dec 01 09:44:03 crc kubenswrapper[4867]: I1201 09:44:03.423033 4867 generic.go:334] "Generic (PLEG): container finished" podID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerID="2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26" exitCode=0 Dec 01 09:44:03 crc kubenswrapper[4867]: I1201 09:44:03.423212 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zz648" event={"ID":"11f59dbb-501a-4f55-9c9f-4a261a636d02","Type":"ContainerDied","Data":"2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26"} Dec 01 09:44:04 crc kubenswrapper[4867]: I1201 09:44:04.432834 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q62lm" 
event={"ID":"bb2700af-cf45-46d4-92e8-6dab22faa157","Type":"ContainerStarted","Data":"61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59"} Dec 01 09:44:04 crc kubenswrapper[4867]: I1201 09:44:04.435525 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kn57f" event={"ID":"fa0aa429-09f2-49cf-8a90-e85a543511df","Type":"ContainerStarted","Data":"c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c"} Dec 01 09:44:05 crc kubenswrapper[4867]: I1201 09:44:05.447770 4867 generic.go:334] "Generic (PLEG): container finished" podID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerID="61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59" exitCode=0 Dec 01 09:44:05 crc kubenswrapper[4867]: I1201 09:44:05.447855 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q62lm" event={"ID":"bb2700af-cf45-46d4-92e8-6dab22faa157","Type":"ContainerDied","Data":"61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59"} Dec 01 09:44:05 crc kubenswrapper[4867]: I1201 09:44:05.453097 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zz648" event={"ID":"11f59dbb-501a-4f55-9c9f-4a261a636d02","Type":"ContainerStarted","Data":"cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865"} Dec 01 09:44:07 crc kubenswrapper[4867]: I1201 09:44:07.470959 4867 generic.go:334] "Generic (PLEG): container finished" podID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerID="cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865" exitCode=0 Dec 01 09:44:07 crc kubenswrapper[4867]: I1201 09:44:07.471104 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zz648" event={"ID":"11f59dbb-501a-4f55-9c9f-4a261a636d02","Type":"ContainerDied","Data":"cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865"} Dec 01 09:44:07 crc kubenswrapper[4867]: I1201 
09:44:07.477223 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q62lm" event={"ID":"bb2700af-cf45-46d4-92e8-6dab22faa157","Type":"ContainerStarted","Data":"407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90"} Dec 01 09:44:07 crc kubenswrapper[4867]: I1201 09:44:07.523565 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q62lm" podStartSLOduration=6.774008375 podStartE2EDuration="10.523541983s" podCreationTimestamp="2025-12-01 09:43:57 +0000 UTC" firstStartedPulling="2025-12-01 09:44:02.394970221 +0000 UTC m=+2163.854356975" lastFinishedPulling="2025-12-01 09:44:06.144503819 +0000 UTC m=+2167.603890583" observedRunningTime="2025-12-01 09:44:07.519675837 +0000 UTC m=+2168.979062601" watchObservedRunningTime="2025-12-01 09:44:07.523541983 +0000 UTC m=+2168.982928737" Dec 01 09:44:07 crc kubenswrapper[4867]: I1201 09:44:07.783595 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:44:07 crc kubenswrapper[4867]: I1201 09:44:07.783639 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:44:08 crc kubenswrapper[4867]: I1201 09:44:08.840428 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-q62lm" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerName="registry-server" probeResult="failure" output=< Dec 01 09:44:08 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 09:44:08 crc kubenswrapper[4867]: > Dec 01 09:44:09 crc kubenswrapper[4867]: I1201 09:44:09.508522 4867 generic.go:334] "Generic (PLEG): container finished" podID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerID="c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c" exitCode=0 Dec 01 09:44:09 crc 
kubenswrapper[4867]: I1201 09:44:09.508611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kn57f" event={"ID":"fa0aa429-09f2-49cf-8a90-e85a543511df","Type":"ContainerDied","Data":"c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c"} Dec 01 09:44:09 crc kubenswrapper[4867]: I1201 09:44:09.511218 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zz648" event={"ID":"11f59dbb-501a-4f55-9c9f-4a261a636d02","Type":"ContainerStarted","Data":"5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a"} Dec 01 09:44:09 crc kubenswrapper[4867]: I1201 09:44:09.556741 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zz648" podStartSLOduration=4.000585917 podStartE2EDuration="8.556720749s" podCreationTimestamp="2025-12-01 09:44:01 +0000 UTC" firstStartedPulling="2025-12-01 09:44:03.427805495 +0000 UTC m=+2164.887192249" lastFinishedPulling="2025-12-01 09:44:07.983940327 +0000 UTC m=+2169.443327081" observedRunningTime="2025-12-01 09:44:09.549760407 +0000 UTC m=+2171.009147161" watchObservedRunningTime="2025-12-01 09:44:09.556720749 +0000 UTC m=+2171.016107503" Dec 01 09:44:10 crc kubenswrapper[4867]: I1201 09:44:10.523907 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kn57f" event={"ID":"fa0aa429-09f2-49cf-8a90-e85a543511df","Type":"ContainerStarted","Data":"7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83"} Dec 01 09:44:10 crc kubenswrapper[4867]: I1201 09:44:10.553400 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kn57f" podStartSLOduration=3.022904023 podStartE2EDuration="10.55338032s" podCreationTimestamp="2025-12-01 09:44:00 +0000 UTC" firstStartedPulling="2025-12-01 09:44:02.400657127 +0000 UTC m=+2163.860043881" lastFinishedPulling="2025-12-01 
09:44:09.931133424 +0000 UTC m=+2171.390520178" observedRunningTime="2025-12-01 09:44:10.546435729 +0000 UTC m=+2172.005822483" watchObservedRunningTime="2025-12-01 09:44:10.55338032 +0000 UTC m=+2172.012767074" Dec 01 09:44:10 crc kubenswrapper[4867]: I1201 09:44:10.801681 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:10 crc kubenswrapper[4867]: I1201 09:44:10.801736 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:11 crc kubenswrapper[4867]: I1201 09:44:11.769453 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:11 crc kubenswrapper[4867]: I1201 09:44:11.769788 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:11 crc kubenswrapper[4867]: I1201 09:44:11.861975 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kn57f" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerName="registry-server" probeResult="failure" output=< Dec 01 09:44:11 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 09:44:11 crc kubenswrapper[4867]: > Dec 01 09:44:12 crc kubenswrapper[4867]: I1201 09:44:12.934095 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zz648" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerName="registry-server" probeResult="failure" output=< Dec 01 09:44:12 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 09:44:12 crc kubenswrapper[4867]: > Dec 01 09:44:17 crc kubenswrapper[4867]: I1201 09:44:17.836920 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 
09:44:17 crc kubenswrapper[4867]: I1201 09:44:17.893278 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:44:18 crc kubenswrapper[4867]: I1201 09:44:18.079093 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q62lm"] Dec 01 09:44:19 crc kubenswrapper[4867]: I1201 09:44:19.604103 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q62lm" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerName="registry-server" containerID="cri-o://407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90" gracePeriod=2 Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.098476 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.161166 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-utilities\") pod \"bb2700af-cf45-46d4-92e8-6dab22faa157\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.161232 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-catalog-content\") pod \"bb2700af-cf45-46d4-92e8-6dab22faa157\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.161353 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lndxp\" (UniqueName: \"kubernetes.io/projected/bb2700af-cf45-46d4-92e8-6dab22faa157-kube-api-access-lndxp\") pod \"bb2700af-cf45-46d4-92e8-6dab22faa157\" (UID: \"bb2700af-cf45-46d4-92e8-6dab22faa157\") " Dec 01 
09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.162562 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-utilities" (OuterVolumeSpecName: "utilities") pod "bb2700af-cf45-46d4-92e8-6dab22faa157" (UID: "bb2700af-cf45-46d4-92e8-6dab22faa157"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.178052 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb2700af-cf45-46d4-92e8-6dab22faa157" (UID: "bb2700af-cf45-46d4-92e8-6dab22faa157"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.190168 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2700af-cf45-46d4-92e8-6dab22faa157-kube-api-access-lndxp" (OuterVolumeSpecName: "kube-api-access-lndxp") pod "bb2700af-cf45-46d4-92e8-6dab22faa157" (UID: "bb2700af-cf45-46d4-92e8-6dab22faa157"). InnerVolumeSpecName "kube-api-access-lndxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.263991 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lndxp\" (UniqueName: \"kubernetes.io/projected/bb2700af-cf45-46d4-92e8-6dab22faa157-kube-api-access-lndxp\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.264260 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.264279 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2700af-cf45-46d4-92e8-6dab22faa157-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.626349 4867 generic.go:334] "Generic (PLEG): container finished" podID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerID="407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90" exitCode=0 Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.626395 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q62lm" event={"ID":"bb2700af-cf45-46d4-92e8-6dab22faa157","Type":"ContainerDied","Data":"407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90"} Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.626421 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q62lm" event={"ID":"bb2700af-cf45-46d4-92e8-6dab22faa157","Type":"ContainerDied","Data":"5dbc32fe285a5f65c9611f724002212580c09291a3a42ee023d33f1c81dcd062"} Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.626436 4867 scope.go:117] "RemoveContainer" containerID="407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 
09:44:20.626585 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q62lm" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.696906 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q62lm"] Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.711237 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q62lm"] Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.726721 4867 scope.go:117] "RemoveContainer" containerID="61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.754858 4867 scope.go:117] "RemoveContainer" containerID="a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.801165 4867 scope.go:117] "RemoveContainer" containerID="407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90" Dec 01 09:44:20 crc kubenswrapper[4867]: E1201 09:44:20.803894 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90\": container with ID starting with 407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90 not found: ID does not exist" containerID="407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.804123 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90"} err="failed to get container status \"407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90\": rpc error: code = NotFound desc = could not find container \"407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90\": container with ID starting with 
407d3a3bb864dffd40c9aa6c8010e2de7ef3c01f1a2ba2017df2fc7f2e266c90 not found: ID does not exist" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.804151 4867 scope.go:117] "RemoveContainer" containerID="61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59" Dec 01 09:44:20 crc kubenswrapper[4867]: E1201 09:44:20.804576 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59\": container with ID starting with 61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59 not found: ID does not exist" containerID="61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.804603 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59"} err="failed to get container status \"61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59\": rpc error: code = NotFound desc = could not find container \"61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59\": container with ID starting with 61475c85ff3e157f9d9946df21956d5387f1566d7019be177c769f105f4efc59 not found: ID does not exist" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.804620 4867 scope.go:117] "RemoveContainer" containerID="a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28" Dec 01 09:44:20 crc kubenswrapper[4867]: E1201 09:44:20.805031 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28\": container with ID starting with a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28 not found: ID does not exist" containerID="a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28" Dec 01 09:44:20 crc 
kubenswrapper[4867]: I1201 09:44:20.805060 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28"} err="failed to get container status \"a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28\": rpc error: code = NotFound desc = could not find container \"a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28\": container with ID starting with a5480fbe96ddc9f4021e26b564bc3d1866b37a63245959853f54902ee0f0ad28 not found: ID does not exist" Dec 01 09:44:20 crc kubenswrapper[4867]: I1201 09:44:20.842298 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" path="/var/lib/kubelet/pods/bb2700af-cf45-46d4-92e8-6dab22faa157/volumes" Dec 01 09:44:21 crc kubenswrapper[4867]: I1201 09:44:21.601605 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:44:21 crc kubenswrapper[4867]: I1201 09:44:21.601682 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:44:21 crc kubenswrapper[4867]: I1201 09:44:21.851067 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:21 crc kubenswrapper[4867]: I1201 09:44:21.871462 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kn57f" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" 
containerName="registry-server" probeResult="failure" output=< Dec 01 09:44:21 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 09:44:21 crc kubenswrapper[4867]: > Dec 01 09:44:21 crc kubenswrapper[4867]: I1201 09:44:21.917516 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:22 crc kubenswrapper[4867]: I1201 09:44:22.476265 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zz648"] Dec 01 09:44:23 crc kubenswrapper[4867]: I1201 09:44:23.648697 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zz648" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerName="registry-server" containerID="cri-o://5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a" gracePeriod=2 Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.175586 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.251761 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-utilities\") pod \"11f59dbb-501a-4f55-9c9f-4a261a636d02\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.251871 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvv6m\" (UniqueName: \"kubernetes.io/projected/11f59dbb-501a-4f55-9c9f-4a261a636d02-kube-api-access-fvv6m\") pod \"11f59dbb-501a-4f55-9c9f-4a261a636d02\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.252020 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-catalog-content\") pod \"11f59dbb-501a-4f55-9c9f-4a261a636d02\" (UID: \"11f59dbb-501a-4f55-9c9f-4a261a636d02\") " Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.252502 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-utilities" (OuterVolumeSpecName: "utilities") pod "11f59dbb-501a-4f55-9c9f-4a261a636d02" (UID: "11f59dbb-501a-4f55-9c9f-4a261a636d02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.259112 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f59dbb-501a-4f55-9c9f-4a261a636d02-kube-api-access-fvv6m" (OuterVolumeSpecName: "kube-api-access-fvv6m") pod "11f59dbb-501a-4f55-9c9f-4a261a636d02" (UID: "11f59dbb-501a-4f55-9c9f-4a261a636d02"). InnerVolumeSpecName "kube-api-access-fvv6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.307864 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11f59dbb-501a-4f55-9c9f-4a261a636d02" (UID: "11f59dbb-501a-4f55-9c9f-4a261a636d02"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.354077 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvv6m\" (UniqueName: \"kubernetes.io/projected/11f59dbb-501a-4f55-9c9f-4a261a636d02-kube-api-access-fvv6m\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.354133 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.354145 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11f59dbb-501a-4f55-9c9f-4a261a636d02-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.660914 4867 generic.go:334] "Generic (PLEG): container finished" podID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerID="5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a" exitCode=0 Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.661000 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zz648" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.661018 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zz648" event={"ID":"11f59dbb-501a-4f55-9c9f-4a261a636d02","Type":"ContainerDied","Data":"5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a"} Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.661409 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zz648" event={"ID":"11f59dbb-501a-4f55-9c9f-4a261a636d02","Type":"ContainerDied","Data":"c5bbfa16b3c975522444ce2b3529aee9a21b32bb70325af70fed4100b58b4eb4"} Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.661436 4867 scope.go:117] "RemoveContainer" containerID="5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.686599 4867 scope.go:117] "RemoveContainer" containerID="cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.698190 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zz648"] Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.709623 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zz648"] Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.725898 4867 scope.go:117] "RemoveContainer" containerID="2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.764255 4867 scope.go:117] "RemoveContainer" containerID="5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a" Dec 01 09:44:24 crc kubenswrapper[4867]: E1201 09:44:24.764656 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a\": container with ID starting with 5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a not found: ID does not exist" containerID="5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.764699 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a"} err="failed to get container status \"5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a\": rpc error: code = NotFound desc = could not find container \"5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a\": container with ID starting with 5b1a61ef19ca77ec29c55ab41b4f264fba5c44659775e655cc33430a73c0865a not found: ID does not exist" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.764728 4867 scope.go:117] "RemoveContainer" containerID="cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865" Dec 01 09:44:24 crc kubenswrapper[4867]: E1201 09:44:24.765327 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865\": container with ID starting with cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865 not found: ID does not exist" containerID="cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.765379 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865"} err="failed to get container status \"cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865\": rpc error: code = NotFound desc = could not find container \"cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865\": container with ID 
starting with cbecbfc06b0da0fe3aae2c2a38f1abf4091e69e328c969b3b198213a651aa865 not found: ID does not exist" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.765393 4867 scope.go:117] "RemoveContainer" containerID="2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26" Dec 01 09:44:24 crc kubenswrapper[4867]: E1201 09:44:24.765777 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26\": container with ID starting with 2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26 not found: ID does not exist" containerID="2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.765864 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26"} err="failed to get container status \"2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26\": rpc error: code = NotFound desc = could not find container \"2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26\": container with ID starting with 2d897bbcb12947069b61578357ca1d6b737c154e5fa2fea89af43a5cdf9d9d26 not found: ID does not exist" Dec 01 09:44:24 crc kubenswrapper[4867]: I1201 09:44:24.840909 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" path="/var/lib/kubelet/pods/11f59dbb-501a-4f55-9c9f-4a261a636d02/volumes" Dec 01 09:44:29 crc kubenswrapper[4867]: I1201 09:44:29.704729 4867 generic.go:334] "Generic (PLEG): container finished" podID="828b404a-aff1-4642-8893-d0ba513e520d" containerID="ead83e40be6a6aa37227cd86bd0f2bf17f9dc2bd67fb318f4c872c8e624bc625" exitCode=0 Dec 01 09:44:29 crc kubenswrapper[4867]: I1201 09:44:29.704840 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" event={"ID":"828b404a-aff1-4642-8893-d0ba513e520d","Type":"ContainerDied","Data":"ead83e40be6a6aa37227cd86bd0f2bf17f9dc2bd67fb318f4c872c8e624bc625"} Dec 01 09:44:30 crc kubenswrapper[4867]: I1201 09:44:30.852645 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:30 crc kubenswrapper[4867]: I1201 09:44:30.913494 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.119607 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.178803 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d2px\" (UniqueName: \"kubernetes.io/projected/828b404a-aff1-4642-8893-d0ba513e520d-kube-api-access-8d2px\") pod \"828b404a-aff1-4642-8893-d0ba513e520d\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.178919 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-ssh-key\") pod \"828b404a-aff1-4642-8893-d0ba513e520d\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.178971 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-inventory\") pod \"828b404a-aff1-4642-8893-d0ba513e520d\" (UID: \"828b404a-aff1-4642-8893-d0ba513e520d\") " Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.183910 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/828b404a-aff1-4642-8893-d0ba513e520d-kube-api-access-8d2px" (OuterVolumeSpecName: "kube-api-access-8d2px") pod "828b404a-aff1-4642-8893-d0ba513e520d" (UID: "828b404a-aff1-4642-8893-d0ba513e520d"). InnerVolumeSpecName "kube-api-access-8d2px". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.205380 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "828b404a-aff1-4642-8893-d0ba513e520d" (UID: "828b404a-aff1-4642-8893-d0ba513e520d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.217025 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-inventory" (OuterVolumeSpecName: "inventory") pod "828b404a-aff1-4642-8893-d0ba513e520d" (UID: "828b404a-aff1-4642-8893-d0ba513e520d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.280531 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d2px\" (UniqueName: \"kubernetes.io/projected/828b404a-aff1-4642-8893-d0ba513e520d-kube-api-access-8d2px\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.280573 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.280582 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828b404a-aff1-4642-8893-d0ba513e520d-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.723184 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.723184 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq" event={"ID":"828b404a-aff1-4642-8893-d0ba513e520d","Type":"ContainerDied","Data":"ad8b71a0300cfb6aaba3b41505fa5e6b353677920a5b3fac44170593cbf998c4"} Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.723604 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad8b71a0300cfb6aaba3b41505fa5e6b353677920a5b3fac44170593cbf998c4" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.817552 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2glfk"] Dec 01 09:44:31 crc kubenswrapper[4867]: E1201 09:44:31.817931 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerName="registry-server" Dec 01 
09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.817951 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerName="registry-server" Dec 01 09:44:31 crc kubenswrapper[4867]: E1201 09:44:31.817965 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerName="extract-content" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.817971 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerName="extract-content" Dec 01 09:44:31 crc kubenswrapper[4867]: E1201 09:44:31.817988 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerName="extract-content" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.817995 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerName="extract-content" Dec 01 09:44:31 crc kubenswrapper[4867]: E1201 09:44:31.818011 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerName="extract-utilities" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.818017 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerName="extract-utilities" Dec 01 09:44:31 crc kubenswrapper[4867]: E1201 09:44:31.818026 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerName="extract-utilities" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.818032 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerName="extract-utilities" Dec 01 09:44:31 crc kubenswrapper[4867]: E1201 09:44:31.818042 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerName="registry-server" Dec 01 
09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.818048 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerName="registry-server" Dec 01 09:44:31 crc kubenswrapper[4867]: E1201 09:44:31.818065 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828b404a-aff1-4642-8893-d0ba513e520d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.818073 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="828b404a-aff1-4642-8893-d0ba513e520d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.818259 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f59dbb-501a-4f55-9c9f-4a261a636d02" containerName="registry-server" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.818313 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="828b404a-aff1-4642-8893-d0ba513e520d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.818323 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2700af-cf45-46d4-92e8-6dab22faa157" containerName="registry-server" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.818984 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.821392 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.822919 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.829329 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.832901 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.836882 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2glfk"] Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.860118 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kn57f"] Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.892069 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2glfk\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.892256 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2glfk\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.892289 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jrnn\" (UniqueName: \"kubernetes.io/projected/b20bdd78-fe72-4ce9-b909-440d2e47e153-kube-api-access-5jrnn\") pod \"ssh-known-hosts-edpm-deployment-2glfk\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.993437 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2glfk\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.993511 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jrnn\" (UniqueName: \"kubernetes.io/projected/b20bdd78-fe72-4ce9-b909-440d2e47e153-kube-api-access-5jrnn\") pod \"ssh-known-hosts-edpm-deployment-2glfk\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:31 crc kubenswrapper[4867]: I1201 09:44:31.993574 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2glfk\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:32 crc kubenswrapper[4867]: I1201 09:44:32.004956 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-2glfk\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:32 crc kubenswrapper[4867]: I1201 09:44:32.005947 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2glfk\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:32 crc kubenswrapper[4867]: I1201 09:44:32.010663 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jrnn\" (UniqueName: \"kubernetes.io/projected/b20bdd78-fe72-4ce9-b909-440d2e47e153-kube-api-access-5jrnn\") pod \"ssh-known-hosts-edpm-deployment-2glfk\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:32 crc kubenswrapper[4867]: I1201 09:44:32.134333 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:32 crc kubenswrapper[4867]: I1201 09:44:32.663670 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2glfk"] Dec 01 09:44:32 crc kubenswrapper[4867]: I1201 09:44:32.731526 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" event={"ID":"b20bdd78-fe72-4ce9-b909-440d2e47e153","Type":"ContainerStarted","Data":"39dc15e4fefc9d7928ce04b9768303ee9ab2f258cbac6620bb550f47cc64230b"} Dec 01 09:44:32 crc kubenswrapper[4867]: I1201 09:44:32.731668 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kn57f" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerName="registry-server" containerID="cri-o://7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83" gracePeriod=2 Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.261311 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.321386 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-utilities\") pod \"fa0aa429-09f2-49cf-8a90-e85a543511df\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.321574 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-catalog-content\") pod \"fa0aa429-09f2-49cf-8a90-e85a543511df\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.321689 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdkj4\" (UniqueName: \"kubernetes.io/projected/fa0aa429-09f2-49cf-8a90-e85a543511df-kube-api-access-cdkj4\") pod \"fa0aa429-09f2-49cf-8a90-e85a543511df\" (UID: \"fa0aa429-09f2-49cf-8a90-e85a543511df\") " Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.325735 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-utilities" (OuterVolumeSpecName: "utilities") pod "fa0aa429-09f2-49cf-8a90-e85a543511df" (UID: "fa0aa429-09f2-49cf-8a90-e85a543511df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.326396 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0aa429-09f2-49cf-8a90-e85a543511df-kube-api-access-cdkj4" (OuterVolumeSpecName: "kube-api-access-cdkj4") pod "fa0aa429-09f2-49cf-8a90-e85a543511df" (UID: "fa0aa429-09f2-49cf-8a90-e85a543511df"). InnerVolumeSpecName "kube-api-access-cdkj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.423727 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdkj4\" (UniqueName: \"kubernetes.io/projected/fa0aa429-09f2-49cf-8a90-e85a543511df-kube-api-access-cdkj4\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.423760 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.441995 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa0aa429-09f2-49cf-8a90-e85a543511df" (UID: "fa0aa429-09f2-49cf-8a90-e85a543511df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.526006 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa0aa429-09f2-49cf-8a90-e85a543511df-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.741922 4867 generic.go:334] "Generic (PLEG): container finished" podID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerID="7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83" exitCode=0 Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.741985 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kn57f" event={"ID":"fa0aa429-09f2-49cf-8a90-e85a543511df","Type":"ContainerDied","Data":"7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83"} Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.742012 4867 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kn57f" event={"ID":"fa0aa429-09f2-49cf-8a90-e85a543511df","Type":"ContainerDied","Data":"439b1170bcc6f98fa425e64ec836d978fbbfa5a3fec57afa5b565a3e5b71f722"} Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.742030 4867 scope.go:117] "RemoveContainer" containerID="7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.742170 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kn57f" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.746660 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" event={"ID":"b20bdd78-fe72-4ce9-b909-440d2e47e153","Type":"ContainerStarted","Data":"9ec2e0f2d2b8e55e802347e9aedbffc7b893fb6f35c9411b128c4b6d92cd0a3b"} Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.762119 4867 scope.go:117] "RemoveContainer" containerID="c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.787296 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" podStartSLOduration=2.603261891 podStartE2EDuration="2.787280791s" podCreationTimestamp="2025-12-01 09:44:31 +0000 UTC" firstStartedPulling="2025-12-01 09:44:32.662993388 +0000 UTC m=+2194.122380142" lastFinishedPulling="2025-12-01 09:44:32.847012288 +0000 UTC m=+2194.306399042" observedRunningTime="2025-12-01 09:44:33.785204484 +0000 UTC m=+2195.244591228" watchObservedRunningTime="2025-12-01 09:44:33.787280791 +0000 UTC m=+2195.246667545" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.796046 4867 scope.go:117] "RemoveContainer" containerID="87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.814880 4867 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-kn57f"] Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.818907 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kn57f"] Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.841243 4867 scope.go:117] "RemoveContainer" containerID="7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83" Dec 01 09:44:33 crc kubenswrapper[4867]: E1201 09:44:33.841604 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83\": container with ID starting with 7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83 not found: ID does not exist" containerID="7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.841631 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83"} err="failed to get container status \"7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83\": rpc error: code = NotFound desc = could not find container \"7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83\": container with ID starting with 7277f19362784ff9d01ade71e2221f56bbf61851f77162d3e565677a71705b83 not found: ID does not exist" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.841651 4867 scope.go:117] "RemoveContainer" containerID="c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c" Dec 01 09:44:33 crc kubenswrapper[4867]: E1201 09:44:33.841888 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c\": container with ID starting with 
c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c not found: ID does not exist" containerID="c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.841931 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c"} err="failed to get container status \"c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c\": rpc error: code = NotFound desc = could not find container \"c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c\": container with ID starting with c959864e4badd8349ecdd195ad00e2d21adade16cd3ee0c34d59301146cf140c not found: ID does not exist" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.841946 4867 scope.go:117] "RemoveContainer" containerID="87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3" Dec 01 09:44:33 crc kubenswrapper[4867]: E1201 09:44:33.842604 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3\": container with ID starting with 87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3 not found: ID does not exist" containerID="87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3" Dec 01 09:44:33 crc kubenswrapper[4867]: I1201 09:44:33.842645 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3"} err="failed to get container status \"87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3\": rpc error: code = NotFound desc = could not find container \"87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3\": container with ID starting with 87c3bca540eafe82cae5197de6d6afbf04e73c4d8b30a054d8d2c37ab4c36fe3 not found: ID does not 
exist" Dec 01 09:44:34 crc kubenswrapper[4867]: I1201 09:44:34.839158 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" path="/var/lib/kubelet/pods/fa0aa429-09f2-49cf-8a90-e85a543511df/volumes" Dec 01 09:44:42 crc kubenswrapper[4867]: I1201 09:44:42.821329 4867 generic.go:334] "Generic (PLEG): container finished" podID="b20bdd78-fe72-4ce9-b909-440d2e47e153" containerID="9ec2e0f2d2b8e55e802347e9aedbffc7b893fb6f35c9411b128c4b6d92cd0a3b" exitCode=0 Dec 01 09:44:42 crc kubenswrapper[4867]: I1201 09:44:42.821836 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" event={"ID":"b20bdd78-fe72-4ce9-b909-440d2e47e153","Type":"ContainerDied","Data":"9ec2e0f2d2b8e55e802347e9aedbffc7b893fb6f35c9411b128c4b6d92cd0a3b"} Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.233847 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.428053 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jrnn\" (UniqueName: \"kubernetes.io/projected/b20bdd78-fe72-4ce9-b909-440d2e47e153-kube-api-access-5jrnn\") pod \"b20bdd78-fe72-4ce9-b909-440d2e47e153\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.428210 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-inventory-0\") pod \"b20bdd78-fe72-4ce9-b909-440d2e47e153\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.428266 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-ssh-key-openstack-edpm-ipam\") pod \"b20bdd78-fe72-4ce9-b909-440d2e47e153\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.433564 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20bdd78-fe72-4ce9-b909-440d2e47e153-kube-api-access-5jrnn" (OuterVolumeSpecName: "kube-api-access-5jrnn") pod "b20bdd78-fe72-4ce9-b909-440d2e47e153" (UID: "b20bdd78-fe72-4ce9-b909-440d2e47e153"). InnerVolumeSpecName "kube-api-access-5jrnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:44:44 crc kubenswrapper[4867]: E1201 09:44:44.454645 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-inventory-0 podName:b20bdd78-fe72-4ce9-b909-440d2e47e153 nodeName:}" failed. No retries permitted until 2025-12-01 09:44:44.954464138 +0000 UTC m=+2206.413850892 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory-0" (UniqueName: "kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-inventory-0") pod "b20bdd78-fe72-4ce9-b909-440d2e47e153" (UID: "b20bdd78-fe72-4ce9-b909-440d2e47e153") : error deleting /var/lib/kubelet/pods/b20bdd78-fe72-4ce9-b909-440d2e47e153/volume-subpaths: remove /var/lib/kubelet/pods/b20bdd78-fe72-4ce9-b909-440d2e47e153/volume-subpaths: no such file or directory Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.457012 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b20bdd78-fe72-4ce9-b909-440d2e47e153" (UID: "b20bdd78-fe72-4ce9-b909-440d2e47e153"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.530555 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.530584 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jrnn\" (UniqueName: \"kubernetes.io/projected/b20bdd78-fe72-4ce9-b909-440d2e47e153-kube-api-access-5jrnn\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.841167 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.842469 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2glfk" event={"ID":"b20bdd78-fe72-4ce9-b909-440d2e47e153","Type":"ContainerDied","Data":"39dc15e4fefc9d7928ce04b9768303ee9ab2f258cbac6620bb550f47cc64230b"} Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.842511 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39dc15e4fefc9d7928ce04b9768303ee9ab2f258cbac6620bb550f47cc64230b" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.919278 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785"] Dec 01 09:44:44 crc kubenswrapper[4867]: E1201 09:44:44.919623 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerName="extract-content" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.919637 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerName="extract-content" Dec 01 09:44:44 crc kubenswrapper[4867]: E1201 
09:44:44.919666 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerName="registry-server" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.919674 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerName="registry-server" Dec 01 09:44:44 crc kubenswrapper[4867]: E1201 09:44:44.919692 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerName="extract-utilities" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.919699 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerName="extract-utilities" Dec 01 09:44:44 crc kubenswrapper[4867]: E1201 09:44:44.919715 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20bdd78-fe72-4ce9-b909-440d2e47e153" containerName="ssh-known-hosts-edpm-deployment" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.919721 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20bdd78-fe72-4ce9-b909-440d2e47e153" containerName="ssh-known-hosts-edpm-deployment" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.919909 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20bdd78-fe72-4ce9-b909-440d2e47e153" containerName="ssh-known-hosts-edpm-deployment" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.919921 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0aa429-09f2-49cf-8a90-e85a543511df" containerName="registry-server" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.920560 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:44 crc kubenswrapper[4867]: I1201 09:44:44.933453 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785"] Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.037627 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-inventory-0\") pod \"b20bdd78-fe72-4ce9-b909-440d2e47e153\" (UID: \"b20bdd78-fe72-4ce9-b909-440d2e47e153\") " Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.038075 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gp785\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.038132 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gp785\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.038209 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvfmv\" (UniqueName: \"kubernetes.io/projected/612c2304-16fe-4932-824d-6116da3a4fb8-kube-api-access-vvfmv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gp785\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 
09:44:45.043941 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b20bdd78-fe72-4ce9-b909-440d2e47e153" (UID: "b20bdd78-fe72-4ce9-b909-440d2e47e153"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.139759 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvfmv\" (UniqueName: \"kubernetes.io/projected/612c2304-16fe-4932-824d-6116da3a4fb8-kube-api-access-vvfmv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gp785\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.139876 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gp785\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.139952 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gp785\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.140039 4867 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b20bdd78-fe72-4ce9-b909-440d2e47e153-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.144083 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gp785\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.144295 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gp785\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.155956 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvfmv\" (UniqueName: \"kubernetes.io/projected/612c2304-16fe-4932-824d-6116da3a4fb8-kube-api-access-vvfmv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gp785\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.236856 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.789238 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785"] Dec 01 09:44:45 crc kubenswrapper[4867]: W1201 09:44:45.793202 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod612c2304_16fe_4932_824d_6116da3a4fb8.slice/crio-5a4f47d0860ca7a1daba490c6bb6a33c77bdfd4726f8bcad7a6185ce9c173ede WatchSource:0}: Error finding container 5a4f47d0860ca7a1daba490c6bb6a33c77bdfd4726f8bcad7a6185ce9c173ede: Status 404 returned error can't find the container with id 5a4f47d0860ca7a1daba490c6bb6a33c77bdfd4726f8bcad7a6185ce9c173ede Dec 01 09:44:45 crc kubenswrapper[4867]: I1201 09:44:45.851373 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" event={"ID":"612c2304-16fe-4932-824d-6116da3a4fb8","Type":"ContainerStarted","Data":"5a4f47d0860ca7a1daba490c6bb6a33c77bdfd4726f8bcad7a6185ce9c173ede"} Dec 01 09:44:46 crc kubenswrapper[4867]: I1201 09:44:46.871487 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" event={"ID":"612c2304-16fe-4932-824d-6116da3a4fb8","Type":"ContainerStarted","Data":"669263b12818c0462865ee05aabf3072c79eb772b42279a170d5e4c7e27258ce"} Dec 01 09:44:46 crc kubenswrapper[4867]: I1201 09:44:46.892121 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" podStartSLOduration=2.7019900359999998 podStartE2EDuration="2.89200362s" podCreationTimestamp="2025-12-01 09:44:44 +0000 UTC" firstStartedPulling="2025-12-01 09:44:45.79550131 +0000 UTC m=+2207.254888064" lastFinishedPulling="2025-12-01 09:44:45.985514894 +0000 UTC m=+2207.444901648" 
observedRunningTime="2025-12-01 09:44:46.891665211 +0000 UTC m=+2208.351051985" watchObservedRunningTime="2025-12-01 09:44:46.89200362 +0000 UTC m=+2208.351390384" Dec 01 09:44:51 crc kubenswrapper[4867]: I1201 09:44:51.601301 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:44:51 crc kubenswrapper[4867]: I1201 09:44:51.601941 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:44:54 crc kubenswrapper[4867]: I1201 09:44:54.936951 4867 generic.go:334] "Generic (PLEG): container finished" podID="612c2304-16fe-4932-824d-6116da3a4fb8" containerID="669263b12818c0462865ee05aabf3072c79eb772b42279a170d5e4c7e27258ce" exitCode=0 Dec 01 09:44:54 crc kubenswrapper[4867]: I1201 09:44:54.937009 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" event={"ID":"612c2304-16fe-4932-824d-6116da3a4fb8","Type":"ContainerDied","Data":"669263b12818c0462865ee05aabf3072c79eb772b42279a170d5e4c7e27258ce"} Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.454100 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.467261 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-inventory\") pod \"612c2304-16fe-4932-824d-6116da3a4fb8\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.467332 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-ssh-key\") pod \"612c2304-16fe-4932-824d-6116da3a4fb8\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.467684 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvfmv\" (UniqueName: \"kubernetes.io/projected/612c2304-16fe-4932-824d-6116da3a4fb8-kube-api-access-vvfmv\") pod \"612c2304-16fe-4932-824d-6116da3a4fb8\" (UID: \"612c2304-16fe-4932-824d-6116da3a4fb8\") " Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.475182 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/612c2304-16fe-4932-824d-6116da3a4fb8-kube-api-access-vvfmv" (OuterVolumeSpecName: "kube-api-access-vvfmv") pod "612c2304-16fe-4932-824d-6116da3a4fb8" (UID: "612c2304-16fe-4932-824d-6116da3a4fb8"). InnerVolumeSpecName "kube-api-access-vvfmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.527120 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-inventory" (OuterVolumeSpecName: "inventory") pod "612c2304-16fe-4932-824d-6116da3a4fb8" (UID: "612c2304-16fe-4932-824d-6116da3a4fb8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.538010 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "612c2304-16fe-4932-824d-6116da3a4fb8" (UID: "612c2304-16fe-4932-824d-6116da3a4fb8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.570322 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvfmv\" (UniqueName: \"kubernetes.io/projected/612c2304-16fe-4932-824d-6116da3a4fb8-kube-api-access-vvfmv\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.570360 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.570371 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/612c2304-16fe-4932-824d-6116da3a4fb8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.957446 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" event={"ID":"612c2304-16fe-4932-824d-6116da3a4fb8","Type":"ContainerDied","Data":"5a4f47d0860ca7a1daba490c6bb6a33c77bdfd4726f8bcad7a6185ce9c173ede"} Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.957500 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a4f47d0860ca7a1daba490c6bb6a33c77bdfd4726f8bcad7a6185ce9c173ede" Dec 01 09:44:56 crc kubenswrapper[4867]: I1201 09:44:56.957553 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gp785" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.044887 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k"] Dec 01 09:44:57 crc kubenswrapper[4867]: E1201 09:44:57.045607 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612c2304-16fe-4932-824d-6116da3a4fb8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.045628 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="612c2304-16fe-4932-824d-6116da3a4fb8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.047291 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="612c2304-16fe-4932-824d-6116da3a4fb8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.047923 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.050860 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.054296 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.054424 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.055704 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.075544 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k"] Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.080383 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gmt2\" (UniqueName: \"kubernetes.io/projected/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-kube-api-access-2gmt2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.080508 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.080566 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.183109 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gmt2\" (UniqueName: \"kubernetes.io/projected/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-kube-api-access-2gmt2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.183274 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.183360 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.188373 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.189758 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.198994 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gmt2\" (UniqueName: \"kubernetes.io/projected/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-kube-api-access-2gmt2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.365714 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.891646 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k"] Dec 01 09:44:57 crc kubenswrapper[4867]: I1201 09:44:57.969581 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" event={"ID":"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817","Type":"ContainerStarted","Data":"04114f6d2e2a3a456e90cad952fb4a7a8a3c1bf264f7e53f15cccd9f9a3d90eb"} Dec 01 09:44:58 crc kubenswrapper[4867]: I1201 09:44:58.978381 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" event={"ID":"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817","Type":"ContainerStarted","Data":"b691766eb7ca1f86e08ef922c263191bbb71790d41fec29c3e0644f699546134"} Dec 01 09:44:59 crc kubenswrapper[4867]: I1201 09:44:59.003441 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" podStartSLOduration=1.863476441 podStartE2EDuration="2.003406701s" podCreationTimestamp="2025-12-01 09:44:57 +0000 UTC" firstStartedPulling="2025-12-01 09:44:57.892962237 +0000 UTC m=+2219.352348991" lastFinishedPulling="2025-12-01 09:44:58.032892497 +0000 UTC m=+2219.492279251" observedRunningTime="2025-12-01 09:44:59.003186215 +0000 UTC m=+2220.462572999" watchObservedRunningTime="2025-12-01 09:44:59.003406701 +0000 UTC m=+2220.462793455" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.145611 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t"] Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.147954 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.150491 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.150508 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.169913 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t"] Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.238433 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwj8w\" (UniqueName: \"kubernetes.io/projected/076005a6-715b-44d7-9162-423fc55eb31d-kube-api-access-pwj8w\") pod \"collect-profiles-29409705-6w67t\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.238481 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/076005a6-715b-44d7-9162-423fc55eb31d-config-volume\") pod \"collect-profiles-29409705-6w67t\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.238588 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/076005a6-715b-44d7-9162-423fc55eb31d-secret-volume\") pod \"collect-profiles-29409705-6w67t\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.341023 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwj8w\" (UniqueName: \"kubernetes.io/projected/076005a6-715b-44d7-9162-423fc55eb31d-kube-api-access-pwj8w\") pod \"collect-profiles-29409705-6w67t\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.341079 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/076005a6-715b-44d7-9162-423fc55eb31d-config-volume\") pod \"collect-profiles-29409705-6w67t\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.341212 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/076005a6-715b-44d7-9162-423fc55eb31d-secret-volume\") pod \"collect-profiles-29409705-6w67t\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.342040 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/076005a6-715b-44d7-9162-423fc55eb31d-config-volume\") pod \"collect-profiles-29409705-6w67t\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.349744 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/076005a6-715b-44d7-9162-423fc55eb31d-secret-volume\") pod \"collect-profiles-29409705-6w67t\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.361779 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwj8w\" (UniqueName: \"kubernetes.io/projected/076005a6-715b-44d7-9162-423fc55eb31d-kube-api-access-pwj8w\") pod \"collect-profiles-29409705-6w67t\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.470491 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:00 crc kubenswrapper[4867]: I1201 09:45:00.975339 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t"] Dec 01 09:45:01 crc kubenswrapper[4867]: I1201 09:45:01.000426 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" event={"ID":"076005a6-715b-44d7-9162-423fc55eb31d","Type":"ContainerStarted","Data":"66663d2be06ccd124831fd267bde883a72c464709b1c14f49e5c3367293f18ba"} Dec 01 09:45:02 crc kubenswrapper[4867]: I1201 09:45:02.009143 4867 generic.go:334] "Generic (PLEG): container finished" podID="076005a6-715b-44d7-9162-423fc55eb31d" containerID="1354c2e646aa137f7bf8a4e3b810ae84b8997b03427351a0bed35aef4fe9d328" exitCode=0 Dec 01 09:45:02 crc kubenswrapper[4867]: I1201 09:45:02.009303 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" 
event={"ID":"076005a6-715b-44d7-9162-423fc55eb31d","Type":"ContainerDied","Data":"1354c2e646aa137f7bf8a4e3b810ae84b8997b03427351a0bed35aef4fe9d328"} Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.332172 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.431148 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwj8w\" (UniqueName: \"kubernetes.io/projected/076005a6-715b-44d7-9162-423fc55eb31d-kube-api-access-pwj8w\") pod \"076005a6-715b-44d7-9162-423fc55eb31d\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.431376 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/076005a6-715b-44d7-9162-423fc55eb31d-config-volume\") pod \"076005a6-715b-44d7-9162-423fc55eb31d\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.431556 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/076005a6-715b-44d7-9162-423fc55eb31d-secret-volume\") pod \"076005a6-715b-44d7-9162-423fc55eb31d\" (UID: \"076005a6-715b-44d7-9162-423fc55eb31d\") " Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.432744 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/076005a6-715b-44d7-9162-423fc55eb31d-config-volume" (OuterVolumeSpecName: "config-volume") pod "076005a6-715b-44d7-9162-423fc55eb31d" (UID: "076005a6-715b-44d7-9162-423fc55eb31d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.436792 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076005a6-715b-44d7-9162-423fc55eb31d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "076005a6-715b-44d7-9162-423fc55eb31d" (UID: "076005a6-715b-44d7-9162-423fc55eb31d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.444938 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076005a6-715b-44d7-9162-423fc55eb31d-kube-api-access-pwj8w" (OuterVolumeSpecName: "kube-api-access-pwj8w") pod "076005a6-715b-44d7-9162-423fc55eb31d" (UID: "076005a6-715b-44d7-9162-423fc55eb31d"). InnerVolumeSpecName "kube-api-access-pwj8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.532843 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwj8w\" (UniqueName: \"kubernetes.io/projected/076005a6-715b-44d7-9162-423fc55eb31d-kube-api-access-pwj8w\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.532873 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/076005a6-715b-44d7-9162-423fc55eb31d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:03 crc kubenswrapper[4867]: I1201 09:45:03.532881 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/076005a6-715b-44d7-9162-423fc55eb31d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:04 crc kubenswrapper[4867]: I1201 09:45:04.027547 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" 
event={"ID":"076005a6-715b-44d7-9162-423fc55eb31d","Type":"ContainerDied","Data":"66663d2be06ccd124831fd267bde883a72c464709b1c14f49e5c3367293f18ba"} Dec 01 09:45:04 crc kubenswrapper[4867]: I1201 09:45:04.027588 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66663d2be06ccd124831fd267bde883a72c464709b1c14f49e5c3367293f18ba" Dec 01 09:45:04 crc kubenswrapper[4867]: I1201 09:45:04.027607 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t" Dec 01 09:45:04 crc kubenswrapper[4867]: I1201 09:45:04.423507 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h"] Dec 01 09:45:04 crc kubenswrapper[4867]: I1201 09:45:04.431090 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-lcc6h"] Dec 01 09:45:04 crc kubenswrapper[4867]: I1201 09:45:04.842278 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e5dcb3-d55f-40cf-a89f-3367e84322d1" path="/var/lib/kubelet/pods/45e5dcb3-d55f-40cf-a89f-3367e84322d1/volumes" Dec 01 09:45:11 crc kubenswrapper[4867]: I1201 09:45:11.091207 4867 generic.go:334] "Generic (PLEG): container finished" podID="4ae71b24-0c2d-46fa-a8e4-3fb8261f6817" containerID="b691766eb7ca1f86e08ef922c263191bbb71790d41fec29c3e0644f699546134" exitCode=0 Dec 01 09:45:11 crc kubenswrapper[4867]: I1201 09:45:11.091313 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" event={"ID":"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817","Type":"ContainerDied","Data":"b691766eb7ca1f86e08ef922c263191bbb71790d41fec29c3e0644f699546134"} Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.512580 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.603572 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-inventory\") pod \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.603944 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-ssh-key\") pod \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.603970 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gmt2\" (UniqueName: \"kubernetes.io/projected/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-kube-api-access-2gmt2\") pod \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\" (UID: \"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817\") " Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.632370 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-kube-api-access-2gmt2" (OuterVolumeSpecName: "kube-api-access-2gmt2") pod "4ae71b24-0c2d-46fa-a8e4-3fb8261f6817" (UID: "4ae71b24-0c2d-46fa-a8e4-3fb8261f6817"). InnerVolumeSpecName "kube-api-access-2gmt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.633917 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4ae71b24-0c2d-46fa-a8e4-3fb8261f6817" (UID: "4ae71b24-0c2d-46fa-a8e4-3fb8261f6817"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.667907 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-inventory" (OuterVolumeSpecName: "inventory") pod "4ae71b24-0c2d-46fa-a8e4-3fb8261f6817" (UID: "4ae71b24-0c2d-46fa-a8e4-3fb8261f6817"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.709352 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.709382 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:12 crc kubenswrapper[4867]: I1201 09:45:12.709392 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gmt2\" (UniqueName: \"kubernetes.io/projected/4ae71b24-0c2d-46fa-a8e4-3fb8261f6817-kube-api-access-2gmt2\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.194649 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" event={"ID":"4ae71b24-0c2d-46fa-a8e4-3fb8261f6817","Type":"ContainerDied","Data":"04114f6d2e2a3a456e90cad952fb4a7a8a3c1bf264f7e53f15cccd9f9a3d90eb"} Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.194691 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04114f6d2e2a3a456e90cad952fb4a7a8a3c1bf264f7e53f15cccd9f9a3d90eb" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.194762 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.263296 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49"] Dec 01 09:45:13 crc kubenswrapper[4867]: E1201 09:45:13.264062 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076005a6-715b-44d7-9162-423fc55eb31d" containerName="collect-profiles" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.264116 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="076005a6-715b-44d7-9162-423fc55eb31d" containerName="collect-profiles" Dec 01 09:45:13 crc kubenswrapper[4867]: E1201 09:45:13.264149 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae71b24-0c2d-46fa-a8e4-3fb8261f6817" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.264220 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae71b24-0c2d-46fa-a8e4-3fb8261f6817" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.264666 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="076005a6-715b-44d7-9162-423fc55eb31d" containerName="collect-profiles" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.264736 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae71b24-0c2d-46fa-a8e4-3fb8261f6817" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.265749 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.273442 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49"] Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.273786 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.273934 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.274111 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.274273 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.274315 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.274357 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.274478 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.274511 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.282509 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.282577 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.282640 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.282677 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.282716 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m8f2\" (UniqueName: 
\"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-kube-api-access-2m8f2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.282749 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.282773 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.282987 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.283051 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.283112 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.283210 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.283252 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.283288 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.283354 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385142 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385204 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m8f2\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-kube-api-access-2m8f2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385240 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385264 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385309 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385335 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385367 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385410 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385435 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385469 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385524 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: 
\"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385567 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385608 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.385683 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.392116 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 
09:45:13.393717 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.394762 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.394951 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.395744 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.396030 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.396240 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.396255 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.396725 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.397526 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.398279 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.398572 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.398881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.404285 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m8f2\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-kube-api-access-2m8f2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b8r49\" (UID: 
\"c4e1c416-5403-4334-bf63-019f8546a2ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:13 crc kubenswrapper[4867]: I1201 09:45:13.590870 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:14 crc kubenswrapper[4867]: I1201 09:45:14.149782 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49"] Dec 01 09:45:14 crc kubenswrapper[4867]: I1201 09:45:14.206412 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" event={"ID":"c4e1c416-5403-4334-bf63-019f8546a2ab","Type":"ContainerStarted","Data":"cfce02c96da68c01bd753881fb353a770dcb53fdc9149e01e52ca3614c489d62"} Dec 01 09:45:15 crc kubenswrapper[4867]: I1201 09:45:15.216048 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" event={"ID":"c4e1c416-5403-4334-bf63-019f8546a2ab","Type":"ContainerStarted","Data":"2918086e3bf419a85005a7a8a16f09822b0b380b6da1cb0d3f99e3f360bf1ceb"} Dec 01 09:45:15 crc kubenswrapper[4867]: I1201 09:45:15.240849 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" podStartSLOduration=2.066002182 podStartE2EDuration="2.240832809s" podCreationTimestamp="2025-12-01 09:45:13 +0000 UTC" firstStartedPulling="2025-12-01 09:45:14.159493824 +0000 UTC m=+2235.618880578" lastFinishedPulling="2025-12-01 09:45:14.334324451 +0000 UTC m=+2235.793711205" observedRunningTime="2025-12-01 09:45:15.233322112 +0000 UTC m=+2236.692708866" watchObservedRunningTime="2025-12-01 09:45:15.240832809 +0000 UTC m=+2236.700219563" Dec 01 09:45:21 crc kubenswrapper[4867]: I1201 09:45:21.356834 4867 scope.go:117] "RemoveContainer" 
containerID="8f899fc084ad650caac3f1299eb696d18ff910f8c08ff8465a177917e24b2f4e" Dec 01 09:45:21 crc kubenswrapper[4867]: I1201 09:45:21.601129 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:45:21 crc kubenswrapper[4867]: I1201 09:45:21.601765 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:45:21 crc kubenswrapper[4867]: I1201 09:45:21.601905 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:45:21 crc kubenswrapper[4867]: I1201 09:45:21.602659 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:45:21 crc kubenswrapper[4867]: I1201 09:45:21.602805 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" gracePeriod=600 Dec 01 09:45:21 crc kubenswrapper[4867]: E1201 09:45:21.721577 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:45:22 crc kubenswrapper[4867]: I1201 09:45:22.282460 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" exitCode=0 Dec 01 09:45:22 crc kubenswrapper[4867]: I1201 09:45:22.282504 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb"} Dec 01 09:45:22 crc kubenswrapper[4867]: I1201 09:45:22.282540 4867 scope.go:117] "RemoveContainer" containerID="bd7e219b0f0fd8d4af12e79f8e7905ebebc9a75776aa03953e5ecb9bd9c9bd42" Dec 01 09:45:22 crc kubenswrapper[4867]: I1201 09:45:22.283257 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:45:22 crc kubenswrapper[4867]: E1201 09:45:22.283557 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:45:36 crc kubenswrapper[4867]: I1201 09:45:36.837664 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:45:36 crc 
kubenswrapper[4867]: E1201 09:45:36.838474 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:45:50 crc kubenswrapper[4867]: I1201 09:45:50.827059 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:45:50 crc kubenswrapper[4867]: E1201 09:45:50.828090 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:45:54 crc kubenswrapper[4867]: I1201 09:45:54.564029 4867 generic.go:334] "Generic (PLEG): container finished" podID="c4e1c416-5403-4334-bf63-019f8546a2ab" containerID="2918086e3bf419a85005a7a8a16f09822b0b380b6da1cb0d3f99e3f360bf1ceb" exitCode=0 Dec 01 09:45:54 crc kubenswrapper[4867]: I1201 09:45:54.564173 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" event={"ID":"c4e1c416-5403-4334-bf63-019f8546a2ab","Type":"ContainerDied","Data":"2918086e3bf419a85005a7a8a16f09822b0b380b6da1cb0d3f99e3f360bf1ceb"} Dec 01 09:45:55 crc kubenswrapper[4867]: I1201 09:45:55.989373 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100082 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100191 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-bootstrap-combined-ca-bundle\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100246 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-ovn-default-certs-0\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100333 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-inventory\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100377 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-repo-setup-combined-ca-bundle\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: 
\"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100408 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ovn-combined-ca-bundle\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100523 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100559 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-neutron-metadata-combined-ca-bundle\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100641 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-telemetry-combined-ca-bundle\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100666 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-nova-combined-ca-bundle\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc 
kubenswrapper[4867]: I1201 09:45:56.100695 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ssh-key\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100716 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-libvirt-combined-ca-bundle\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100736 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.100794 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m8f2\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-kube-api-access-2m8f2\") pod \"c4e1c416-5403-4334-bf63-019f8546a2ab\" (UID: \"c4e1c416-5403-4334-bf63-019f8546a2ab\") " Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.105483 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.106528 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-kube-api-access-2m8f2" (OuterVolumeSpecName: "kube-api-access-2m8f2") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "kube-api-access-2m8f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.107198 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.107429 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.109731 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.109902 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.109911 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.110013 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.111717 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.111734 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.113054 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.121339 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.131335 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.143036 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-inventory" (OuterVolumeSpecName: "inventory") pod "c4e1c416-5403-4334-bf63-019f8546a2ab" (UID: "c4e1c416-5403-4334-bf63-019f8546a2ab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.202500 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m8f2\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-kube-api-access-2m8f2\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.202892 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.202909 4867 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.202923 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.202937 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc 
kubenswrapper[4867]: I1201 09:45:56.202948 4867 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.202959 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.202969 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.202979 4867 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.202990 4867 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.203000 4867 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.203010 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-ssh-key\") on node 
\"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.203019 4867 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e1c416-5403-4334-bf63-019f8546a2ab-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.203030 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4e1c416-5403-4334-bf63-019f8546a2ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.589269 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" event={"ID":"c4e1c416-5403-4334-bf63-019f8546a2ab","Type":"ContainerDied","Data":"cfce02c96da68c01bd753881fb353a770dcb53fdc9149e01e52ca3614c489d62"} Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.590406 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfce02c96da68c01bd753881fb353a770dcb53fdc9149e01e52ca3614c489d62" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.589388 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b8r49" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.763293 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t"] Dec 01 09:45:56 crc kubenswrapper[4867]: E1201 09:45:56.763731 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e1c416-5403-4334-bf63-019f8546a2ab" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.763756 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e1c416-5403-4334-bf63-019f8546a2ab" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.764015 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e1c416-5403-4334-bf63-019f8546a2ab" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.764817 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.770705 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.770800 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.771118 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.771268 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.771448 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.775250 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t"] Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.812351 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.812673 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: 
\"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.812973 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwlm5\" (UniqueName: \"kubernetes.io/projected/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-kube-api-access-nwlm5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.813129 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.813234 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.914934 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.915250 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.915411 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwlm5\" (UniqueName: \"kubernetes.io/projected/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-kube-api-access-nwlm5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.915539 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.915644 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.916312 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc 
kubenswrapper[4867]: I1201 09:45:56.919062 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.921742 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.928678 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:56 crc kubenswrapper[4867]: I1201 09:45:56.935527 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwlm5\" (UniqueName: \"kubernetes.io/projected/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-kube-api-access-nwlm5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wpt2t\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:57 crc kubenswrapper[4867]: I1201 09:45:57.121411 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:45:57 crc kubenswrapper[4867]: I1201 09:45:57.643045 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t"] Dec 01 09:45:58 crc kubenswrapper[4867]: I1201 09:45:58.605729 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" event={"ID":"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7","Type":"ContainerStarted","Data":"dc9473d03a00f2520b22d43261026421603100150364450bd9d262611235a6ed"} Dec 01 09:45:58 crc kubenswrapper[4867]: I1201 09:45:58.606219 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" event={"ID":"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7","Type":"ContainerStarted","Data":"05ca3c6ffc8c1f6e38cdaf7f9c27214f8355fb01387cf222b2d157602d9f1bb6"} Dec 01 09:46:01 crc kubenswrapper[4867]: I1201 09:46:01.828173 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:46:01 crc kubenswrapper[4867]: E1201 09:46:01.829112 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:46:16 crc kubenswrapper[4867]: I1201 09:46:16.827274 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:46:16 crc kubenswrapper[4867]: E1201 09:46:16.828024 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:46:27 crc kubenswrapper[4867]: I1201 09:46:27.837983 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" podStartSLOduration=31.65161399 podStartE2EDuration="31.837966333s" podCreationTimestamp="2025-12-01 09:45:56 +0000 UTC" firstStartedPulling="2025-12-01 09:45:57.650141507 +0000 UTC m=+2279.109528261" lastFinishedPulling="2025-12-01 09:45:57.83649385 +0000 UTC m=+2279.295880604" observedRunningTime="2025-12-01 09:45:58.619719645 +0000 UTC m=+2280.079106399" watchObservedRunningTime="2025-12-01 09:46:27.837966333 +0000 UTC m=+2309.297353097" Dec 01 09:46:27 crc kubenswrapper[4867]: I1201 09:46:27.847793 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6zh7m"] Dec 01 09:46:27 crc kubenswrapper[4867]: I1201 09:46:27.849620 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:27 crc kubenswrapper[4867]: I1201 09:46:27.864577 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zh7m"] Dec 01 09:46:27 crc kubenswrapper[4867]: I1201 09:46:27.916313 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-catalog-content\") pod \"community-operators-6zh7m\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:27 crc kubenswrapper[4867]: I1201 09:46:27.916397 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-utilities\") pod \"community-operators-6zh7m\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:27 crc kubenswrapper[4867]: I1201 09:46:27.916467 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnzrq\" (UniqueName: \"kubernetes.io/projected/8edb1d42-d620-45f8-aa51-71d8c983feaf-kube-api-access-wnzrq\") pod \"community-operators-6zh7m\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:28 crc kubenswrapper[4867]: I1201 09:46:28.018109 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-catalog-content\") pod \"community-operators-6zh7m\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:28 crc kubenswrapper[4867]: I1201 09:46:28.018190 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-utilities\") pod \"community-operators-6zh7m\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:28 crc kubenswrapper[4867]: I1201 09:46:28.018239 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnzrq\" (UniqueName: \"kubernetes.io/projected/8edb1d42-d620-45f8-aa51-71d8c983feaf-kube-api-access-wnzrq\") pod \"community-operators-6zh7m\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:28 crc kubenswrapper[4867]: I1201 09:46:28.018877 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-utilities\") pod \"community-operators-6zh7m\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:28 crc kubenswrapper[4867]: I1201 09:46:28.019075 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-catalog-content\") pod \"community-operators-6zh7m\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:28 crc kubenswrapper[4867]: I1201 09:46:28.040664 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnzrq\" (UniqueName: \"kubernetes.io/projected/8edb1d42-d620-45f8-aa51-71d8c983feaf-kube-api-access-wnzrq\") pod \"community-operators-6zh7m\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:28 crc kubenswrapper[4867]: I1201 09:46:28.177863 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:28 crc kubenswrapper[4867]: I1201 09:46:28.834408 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:46:28 crc kubenswrapper[4867]: E1201 09:46:28.835242 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:46:28 crc kubenswrapper[4867]: I1201 09:46:28.947680 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zh7m"] Dec 01 09:46:29 crc kubenswrapper[4867]: I1201 09:46:29.853320 4867 generic.go:334] "Generic (PLEG): container finished" podID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerID="c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be" exitCode=0 Dec 01 09:46:29 crc kubenswrapper[4867]: I1201 09:46:29.853390 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zh7m" event={"ID":"8edb1d42-d620-45f8-aa51-71d8c983feaf","Type":"ContainerDied","Data":"c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be"} Dec 01 09:46:29 crc kubenswrapper[4867]: I1201 09:46:29.853611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zh7m" event={"ID":"8edb1d42-d620-45f8-aa51-71d8c983feaf","Type":"ContainerStarted","Data":"607dde3708df49639889db63456887da240878161b93e7e0a3efce0dcbff5224"} Dec 01 09:46:30 crc kubenswrapper[4867]: I1201 09:46:30.870486 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zh7m" 
event={"ID":"8edb1d42-d620-45f8-aa51-71d8c983feaf","Type":"ContainerStarted","Data":"28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002"} Dec 01 09:46:31 crc kubenswrapper[4867]: I1201 09:46:31.882233 4867 generic.go:334] "Generic (PLEG): container finished" podID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerID="28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002" exitCode=0 Dec 01 09:46:31 crc kubenswrapper[4867]: I1201 09:46:31.882345 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zh7m" event={"ID":"8edb1d42-d620-45f8-aa51-71d8c983feaf","Type":"ContainerDied","Data":"28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002"} Dec 01 09:46:32 crc kubenswrapper[4867]: I1201 09:46:32.896281 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zh7m" event={"ID":"8edb1d42-d620-45f8-aa51-71d8c983feaf","Type":"ContainerStarted","Data":"62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247"} Dec 01 09:46:32 crc kubenswrapper[4867]: I1201 09:46:32.920425 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6zh7m" podStartSLOduration=3.205485033 podStartE2EDuration="5.920406699s" podCreationTimestamp="2025-12-01 09:46:27 +0000 UTC" firstStartedPulling="2025-12-01 09:46:29.855611392 +0000 UTC m=+2311.314998146" lastFinishedPulling="2025-12-01 09:46:32.570533058 +0000 UTC m=+2314.029919812" observedRunningTime="2025-12-01 09:46:32.916397638 +0000 UTC m=+2314.375784392" watchObservedRunningTime="2025-12-01 09:46:32.920406699 +0000 UTC m=+2314.379793453" Dec 01 09:46:38 crc kubenswrapper[4867]: I1201 09:46:38.179107 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:38 crc kubenswrapper[4867]: I1201 09:46:38.179626 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:38 crc kubenswrapper[4867]: I1201 09:46:38.224176 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:38 crc kubenswrapper[4867]: I1201 09:46:38.994550 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:39 crc kubenswrapper[4867]: I1201 09:46:39.039560 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6zh7m"] Dec 01 09:46:40 crc kubenswrapper[4867]: I1201 09:46:40.826988 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:46:40 crc kubenswrapper[4867]: E1201 09:46:40.827665 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:46:40 crc kubenswrapper[4867]: I1201 09:46:40.972281 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6zh7m" podUID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerName="registry-server" containerID="cri-o://62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247" gracePeriod=2 Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.421852 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.504957 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-utilities\") pod \"8edb1d42-d620-45f8-aa51-71d8c983feaf\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.505141 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-catalog-content\") pod \"8edb1d42-d620-45f8-aa51-71d8c983feaf\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.505361 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnzrq\" (UniqueName: \"kubernetes.io/projected/8edb1d42-d620-45f8-aa51-71d8c983feaf-kube-api-access-wnzrq\") pod \"8edb1d42-d620-45f8-aa51-71d8c983feaf\" (UID: \"8edb1d42-d620-45f8-aa51-71d8c983feaf\") " Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.507705 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-utilities" (OuterVolumeSpecName: "utilities") pod "8edb1d42-d620-45f8-aa51-71d8c983feaf" (UID: "8edb1d42-d620-45f8-aa51-71d8c983feaf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.519286 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edb1d42-d620-45f8-aa51-71d8c983feaf-kube-api-access-wnzrq" (OuterVolumeSpecName: "kube-api-access-wnzrq") pod "8edb1d42-d620-45f8-aa51-71d8c983feaf" (UID: "8edb1d42-d620-45f8-aa51-71d8c983feaf"). InnerVolumeSpecName "kube-api-access-wnzrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.558005 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8edb1d42-d620-45f8-aa51-71d8c983feaf" (UID: "8edb1d42-d620-45f8-aa51-71d8c983feaf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.607407 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnzrq\" (UniqueName: \"kubernetes.io/projected/8edb1d42-d620-45f8-aa51-71d8c983feaf-kube-api-access-wnzrq\") on node \"crc\" DevicePath \"\"" Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.607455 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.607483 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edb1d42-d620-45f8-aa51-71d8c983feaf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.985650 4867 generic.go:334] "Generic (PLEG): container finished" podID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerID="62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247" exitCode=0 Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.985695 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zh7m" event={"ID":"8edb1d42-d620-45f8-aa51-71d8c983feaf","Type":"ContainerDied","Data":"62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247"} Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.985722 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-6zh7m" event={"ID":"8edb1d42-d620-45f8-aa51-71d8c983feaf","Type":"ContainerDied","Data":"607dde3708df49639889db63456887da240878161b93e7e0a3efce0dcbff5224"} Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.985738 4867 scope.go:117] "RemoveContainer" containerID="62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247" Dec 01 09:46:41 crc kubenswrapper[4867]: I1201 09:46:41.985935 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6zh7m" Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.032133 4867 scope.go:117] "RemoveContainer" containerID="28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002" Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.061875 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6zh7m"] Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.072855 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6zh7m"] Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.080299 4867 scope.go:117] "RemoveContainer" containerID="c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be" Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.110839 4867 scope.go:117] "RemoveContainer" containerID="62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247" Dec 01 09:46:42 crc kubenswrapper[4867]: E1201 09:46:42.111327 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247\": container with ID starting with 62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247 not found: ID does not exist" containerID="62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247" Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 
09:46:42.111368 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247"} err="failed to get container status \"62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247\": rpc error: code = NotFound desc = could not find container \"62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247\": container with ID starting with 62b575e7427d6ae8d6f5cf8fe8fc455a2329fc8957086747cddc5c52ecf2e247 not found: ID does not exist" Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.111390 4867 scope.go:117] "RemoveContainer" containerID="28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002" Dec 01 09:46:42 crc kubenswrapper[4867]: E1201 09:46:42.111723 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002\": container with ID starting with 28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002 not found: ID does not exist" containerID="28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002" Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.111745 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002"} err="failed to get container status \"28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002\": rpc error: code = NotFound desc = could not find container \"28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002\": container with ID starting with 28e414862f980815550ea52fd04e3ad09fce6ea2be94872b9444c05740cf1002 not found: ID does not exist" Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.111758 4867 scope.go:117] "RemoveContainer" containerID="c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be" Dec 01 09:46:42 crc 
kubenswrapper[4867]: E1201 09:46:42.112279 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be\": container with ID starting with c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be not found: ID does not exist" containerID="c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be" Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.112303 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be"} err="failed to get container status \"c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be\": rpc error: code = NotFound desc = could not find container \"c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be\": container with ID starting with c0a251c49bc4b923cecc51c30ae6623652f3589a0317468b2f8a84deab3ff8be not found: ID does not exist" Dec 01 09:46:42 crc kubenswrapper[4867]: I1201 09:46:42.836666 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8edb1d42-d620-45f8-aa51-71d8c983feaf" path="/var/lib/kubelet/pods/8edb1d42-d620-45f8-aa51-71d8c983feaf/volumes" Dec 01 09:46:54 crc kubenswrapper[4867]: I1201 09:46:54.827583 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:46:54 crc kubenswrapper[4867]: E1201 09:46:54.828511 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:47:07 crc 
kubenswrapper[4867]: I1201 09:47:07.827233 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:47:07 crc kubenswrapper[4867]: E1201 09:47:07.828043 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:47:11 crc kubenswrapper[4867]: I1201 09:47:11.268943 4867 generic.go:334] "Generic (PLEG): container finished" podID="5f9d5cec-8d85-4f56-b876-06c32bb0a3e7" containerID="dc9473d03a00f2520b22d43261026421603100150364450bd9d262611235a6ed" exitCode=0 Dec 01 09:47:11 crc kubenswrapper[4867]: I1201 09:47:11.269060 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" event={"ID":"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7","Type":"ContainerDied","Data":"dc9473d03a00f2520b22d43261026421603100150364450bd9d262611235a6ed"} Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.693161 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.841568 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ssh-key\") pod \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.841615 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwlm5\" (UniqueName: \"kubernetes.io/projected/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-kube-api-access-nwlm5\") pod \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.841749 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-inventory\") pod \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.841843 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovn-combined-ca-bundle\") pod \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.841935 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovncontroller-config-0\") pod \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\" (UID: \"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7\") " Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.847558 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7" (UID: "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.853629 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-kube-api-access-nwlm5" (OuterVolumeSpecName: "kube-api-access-nwlm5") pod "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7" (UID: "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7"). InnerVolumeSpecName "kube-api-access-nwlm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.871601 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7" (UID: "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.880785 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-inventory" (OuterVolumeSpecName: "inventory") pod "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7" (UID: "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.886553 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7" (UID: "5f9d5cec-8d85-4f56-b876-06c32bb0a3e7"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.944940 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.944980 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.945015 4867 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.945027 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:47:12 crc kubenswrapper[4867]: I1201 09:47:12.945039 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwlm5\" (UniqueName: \"kubernetes.io/projected/5f9d5cec-8d85-4f56-b876-06c32bb0a3e7-kube-api-access-nwlm5\") on node \"crc\" DevicePath \"\"" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.285456 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" event={"ID":"5f9d5cec-8d85-4f56-b876-06c32bb0a3e7","Type":"ContainerDied","Data":"05ca3c6ffc8c1f6e38cdaf7f9c27214f8355fb01387cf222b2d157602d9f1bb6"} Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.286024 4867 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="05ca3c6ffc8c1f6e38cdaf7f9c27214f8355fb01387cf222b2d157602d9f1bb6" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.285710 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wpt2t" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.470503 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66"] Dec 01 09:47:13 crc kubenswrapper[4867]: E1201 09:47:13.471241 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9d5cec-8d85-4f56-b876-06c32bb0a3e7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.471341 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9d5cec-8d85-4f56-b876-06c32bb0a3e7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 09:47:13 crc kubenswrapper[4867]: E1201 09:47:13.471457 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerName="extract-utilities" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.471549 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerName="extract-utilities" Dec 01 09:47:13 crc kubenswrapper[4867]: E1201 09:47:13.471679 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerName="extract-content" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.471774 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerName="extract-content" Dec 01 09:47:13 crc kubenswrapper[4867]: E1201 09:47:13.471879 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerName="registry-server" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.471953 4867 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerName="registry-server" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.472272 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9d5cec-8d85-4f56-b876-06c32bb0a3e7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.472392 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8edb1d42-d620-45f8-aa51-71d8c983feaf" containerName="registry-server" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.473396 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.479648 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.480407 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.480917 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.481271 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.481638 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.482040 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.491807 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66"] Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.656484 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.656785 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdjmb\" (UniqueName: \"kubernetes.io/projected/27c456e0-7f00-42b5-b4e7-c5120389d2c1-kube-api-access-kdjmb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.656934 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.657074 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.657168 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.657288 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.758980 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.759093 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 
09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.759125 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdjmb\" (UniqueName: \"kubernetes.io/projected/27c456e0-7f00-42b5-b4e7-c5120389d2c1-kube-api-access-kdjmb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.759203 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.759311 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.759354 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.763981 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.764104 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.764646 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.765395 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.767867 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-ssh-key\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.782034 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdjmb\" (UniqueName: \"kubernetes.io/projected/27c456e0-7f00-42b5-b4e7-c5120389d2c1-kube-api-access-kdjmb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:13 crc kubenswrapper[4867]: I1201 09:47:13.836367 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:47:14 crc kubenswrapper[4867]: I1201 09:47:14.442639 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66"] Dec 01 09:47:15 crc kubenswrapper[4867]: I1201 09:47:15.305396 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" event={"ID":"27c456e0-7f00-42b5-b4e7-c5120389d2c1","Type":"ContainerStarted","Data":"1a36ff2e43a44b5e66ed3f2fae818d91e1e38869ac8c655c6c1ecaac4814386e"} Dec 01 09:47:15 crc kubenswrapper[4867]: I1201 09:47:15.305915 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" event={"ID":"27c456e0-7f00-42b5-b4e7-c5120389d2c1","Type":"ContainerStarted","Data":"e264ef5f3233f773ac9354d6063243794ad58caa470e4d54e72f5dde0b6c5a5a"} Dec 01 09:47:15 crc kubenswrapper[4867]: I1201 09:47:15.328783 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" podStartSLOduration=2.133035307 
podStartE2EDuration="2.328767049s" podCreationTimestamp="2025-12-01 09:47:13 +0000 UTC" firstStartedPulling="2025-12-01 09:47:14.404991228 +0000 UTC m=+2355.864377982" lastFinishedPulling="2025-12-01 09:47:14.60072297 +0000 UTC m=+2356.060109724" observedRunningTime="2025-12-01 09:47:15.325366756 +0000 UTC m=+2356.784753510" watchObservedRunningTime="2025-12-01 09:47:15.328767049 +0000 UTC m=+2356.788153803" Dec 01 09:47:18 crc kubenswrapper[4867]: I1201 09:47:18.850526 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:47:18 crc kubenswrapper[4867]: E1201 09:47:18.851407 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:47:33 crc kubenswrapper[4867]: I1201 09:47:33.828189 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:47:33 crc kubenswrapper[4867]: E1201 09:47:33.829361 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:47:47 crc kubenswrapper[4867]: I1201 09:47:47.826743 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:47:47 crc kubenswrapper[4867]: E1201 09:47:47.827766 
4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:48:02 crc kubenswrapper[4867]: I1201 09:48:02.828652 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:48:02 crc kubenswrapper[4867]: E1201 09:48:02.829868 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:48:04 crc kubenswrapper[4867]: I1201 09:48:04.727459 4867 generic.go:334] "Generic (PLEG): container finished" podID="27c456e0-7f00-42b5-b4e7-c5120389d2c1" containerID="1a36ff2e43a44b5e66ed3f2fae818d91e1e38869ac8c655c6c1ecaac4814386e" exitCode=0 Dec 01 09:48:04 crc kubenswrapper[4867]: I1201 09:48:04.727546 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" event={"ID":"27c456e0-7f00-42b5-b4e7-c5120389d2c1","Type":"ContainerDied","Data":"1a36ff2e43a44b5e66ed3f2fae818d91e1e38869ac8c655c6c1ecaac4814386e"} Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.154033 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.240618 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-nova-metadata-neutron-config-0\") pod \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.240757 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-ssh-key\") pod \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.241045 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdjmb\" (UniqueName: \"kubernetes.io/projected/27c456e0-7f00-42b5-b4e7-c5120389d2c1-kube-api-access-kdjmb\") pod \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.241139 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.241180 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-inventory\") pod \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 
09:48:06.241229 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-metadata-combined-ca-bundle\") pod \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\" (UID: \"27c456e0-7f00-42b5-b4e7-c5120389d2c1\") " Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.260821 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "27c456e0-7f00-42b5-b4e7-c5120389d2c1" (UID: "27c456e0-7f00-42b5-b4e7-c5120389d2c1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.266117 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c456e0-7f00-42b5-b4e7-c5120389d2c1-kube-api-access-kdjmb" (OuterVolumeSpecName: "kube-api-access-kdjmb") pod "27c456e0-7f00-42b5-b4e7-c5120389d2c1" (UID: "27c456e0-7f00-42b5-b4e7-c5120389d2c1"). InnerVolumeSpecName "kube-api-access-kdjmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.288336 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "27c456e0-7f00-42b5-b4e7-c5120389d2c1" (UID: "27c456e0-7f00-42b5-b4e7-c5120389d2c1"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.290182 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "27c456e0-7f00-42b5-b4e7-c5120389d2c1" (UID: "27c456e0-7f00-42b5-b4e7-c5120389d2c1"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.302371 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-inventory" (OuterVolumeSpecName: "inventory") pod "27c456e0-7f00-42b5-b4e7-c5120389d2c1" (UID: "27c456e0-7f00-42b5-b4e7-c5120389d2c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.324055 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "27c456e0-7f00-42b5-b4e7-c5120389d2c1" (UID: "27c456e0-7f00-42b5-b4e7-c5120389d2c1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.347062 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdjmb\" (UniqueName: \"kubernetes.io/projected/27c456e0-7f00-42b5-b4e7-c5120389d2c1-kube-api-access-kdjmb\") on node \"crc\" DevicePath \"\"" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.347104 4867 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.347120 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.347138 4867 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.347162 4867 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.347178 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27c456e0-7f00-42b5-b4e7-c5120389d2c1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.744225 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" 
event={"ID":"27c456e0-7f00-42b5-b4e7-c5120389d2c1","Type":"ContainerDied","Data":"e264ef5f3233f773ac9354d6063243794ad58caa470e4d54e72f5dde0b6c5a5a"} Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.744550 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e264ef5f3233f773ac9354d6063243794ad58caa470e4d54e72f5dde0b6c5a5a" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.744252 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.848738 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp"] Dec 01 09:48:06 crc kubenswrapper[4867]: E1201 09:48:06.849219 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c456e0-7f00-42b5-b4e7-c5120389d2c1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.849242 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c456e0-7f00-42b5-b4e7-c5120389d2c1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.849455 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c456e0-7f00-42b5-b4e7-c5120389d2c1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.851199 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.861093 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.861114 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.861239 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.861587 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.861708 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.862454 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp"] Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.964511 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.964584 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvjtn\" (UniqueName: \"kubernetes.io/projected/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-kube-api-access-wvjtn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.964627 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.964676 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:06 crc kubenswrapper[4867]: I1201 09:48:06.964708 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.066636 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.066740 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.066842 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.066971 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.067028 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvjtn\" (UniqueName: \"kubernetes.io/projected/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-kube-api-access-wvjtn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.070530 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.071056 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.072332 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.072525 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.087980 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvjtn\" (UniqueName: \"kubernetes.io/projected/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-kube-api-access-wvjtn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.179114 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.738563 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp"] Dec 01 09:48:07 crc kubenswrapper[4867]: I1201 09:48:07.753556 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" event={"ID":"3e0cef16-29f1-49cf-aee1-7c5d9963aa81","Type":"ContainerStarted","Data":"6b84d22be0c066a3fb461c165de945835f542544b087a711ac12424e305f9915"} Dec 01 09:48:08 crc kubenswrapper[4867]: I1201 09:48:08.767989 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" event={"ID":"3e0cef16-29f1-49cf-aee1-7c5d9963aa81","Type":"ContainerStarted","Data":"92bbabfd429597fc61b60c0fa14f32a57155af813b281c2755ddb2b7c8d40283"} Dec 01 09:48:08 crc kubenswrapper[4867]: I1201 09:48:08.790553 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" podStartSLOduration=2.497224215 podStartE2EDuration="2.790532347s" podCreationTimestamp="2025-12-01 09:48:06 +0000 UTC" firstStartedPulling="2025-12-01 09:48:07.73901215 +0000 UTC m=+2409.198398904" lastFinishedPulling="2025-12-01 09:48:08.032320282 +0000 UTC m=+2409.491707036" observedRunningTime="2025-12-01 09:48:08.786645311 +0000 UTC m=+2410.246032075" watchObservedRunningTime="2025-12-01 09:48:08.790532347 +0000 UTC m=+2410.249919101" Dec 01 09:48:14 crc kubenswrapper[4867]: I1201 09:48:14.827241 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:48:14 crc kubenswrapper[4867]: E1201 09:48:14.828110 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:48:29 crc kubenswrapper[4867]: I1201 09:48:29.827390 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:48:29 crc kubenswrapper[4867]: E1201 09:48:29.828177 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:48:42 crc kubenswrapper[4867]: I1201 09:48:42.827857 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:48:42 crc kubenswrapper[4867]: E1201 09:48:42.828632 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:48:54 crc kubenswrapper[4867]: I1201 09:48:54.827128 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:48:54 crc kubenswrapper[4867]: E1201 09:48:54.827976 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:49:07 crc kubenswrapper[4867]: I1201 09:49:07.827537 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:49:07 crc kubenswrapper[4867]: E1201 09:49:07.829922 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:49:21 crc kubenswrapper[4867]: I1201 09:49:21.826979 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:49:21 crc kubenswrapper[4867]: E1201 09:49:21.827762 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:49:36 crc kubenswrapper[4867]: I1201 09:49:36.827790 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:49:36 crc kubenswrapper[4867]: E1201 09:49:36.829101 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:49:50 crc kubenswrapper[4867]: I1201 09:49:50.831181 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:49:50 crc kubenswrapper[4867]: E1201 09:49:50.832230 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:50:02 crc kubenswrapper[4867]: I1201 09:50:02.827883 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:50:02 crc kubenswrapper[4867]: E1201 09:50:02.828709 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:50:15 crc kubenswrapper[4867]: I1201 09:50:15.827914 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:50:15 crc kubenswrapper[4867]: E1201 09:50:15.829148 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:50:28 crc kubenswrapper[4867]: I1201 09:50:28.835047 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:50:29 crc kubenswrapper[4867]: I1201 09:50:29.536651 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"dbc913c646b56f27ce8d54433b201384f1916f6fc4508d536119f230487c7618"} Dec 01 09:52:51 crc kubenswrapper[4867]: I1201 09:52:51.601979 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:52:51 crc kubenswrapper[4867]: I1201 09:52:51.603106 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:53:15 crc kubenswrapper[4867]: I1201 09:53:15.979735 4867 generic.go:334] "Generic (PLEG): container finished" podID="3e0cef16-29f1-49cf-aee1-7c5d9963aa81" containerID="92bbabfd429597fc61b60c0fa14f32a57155af813b281c2755ddb2b7c8d40283" exitCode=0 Dec 01 09:53:15 crc kubenswrapper[4867]: I1201 09:53:15.980385 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" event={"ID":"3e0cef16-29f1-49cf-aee1-7c5d9963aa81","Type":"ContainerDied","Data":"92bbabfd429597fc61b60c0fa14f32a57155af813b281c2755ddb2b7c8d40283"} Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.386317 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.519937 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvjtn\" (UniqueName: \"kubernetes.io/projected/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-kube-api-access-wvjtn\") pod \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.520355 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-combined-ca-bundle\") pod \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.520444 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-secret-0\") pod \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.520493 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-ssh-key\") pod \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.520551 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-inventory\") pod \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\" (UID: \"3e0cef16-29f1-49cf-aee1-7c5d9963aa81\") " Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.542517 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-kube-api-access-wvjtn" (OuterVolumeSpecName: "kube-api-access-wvjtn") pod "3e0cef16-29f1-49cf-aee1-7c5d9963aa81" (UID: "3e0cef16-29f1-49cf-aee1-7c5d9963aa81"). InnerVolumeSpecName "kube-api-access-wvjtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.542590 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3e0cef16-29f1-49cf-aee1-7c5d9963aa81" (UID: "3e0cef16-29f1-49cf-aee1-7c5d9963aa81"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.552310 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-inventory" (OuterVolumeSpecName: "inventory") pod "3e0cef16-29f1-49cf-aee1-7c5d9963aa81" (UID: "3e0cef16-29f1-49cf-aee1-7c5d9963aa81"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.555181 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e0cef16-29f1-49cf-aee1-7c5d9963aa81" (UID: "3e0cef16-29f1-49cf-aee1-7c5d9963aa81"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.574888 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "3e0cef16-29f1-49cf-aee1-7c5d9963aa81" (UID: "3e0cef16-29f1-49cf-aee1-7c5d9963aa81"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.622395 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvjtn\" (UniqueName: \"kubernetes.io/projected/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-kube-api-access-wvjtn\") on node \"crc\" DevicePath \"\"" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.622437 4867 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.622448 4867 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.622457 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.622469 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e0cef16-29f1-49cf-aee1-7c5d9963aa81-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.998152 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" event={"ID":"3e0cef16-29f1-49cf-aee1-7c5d9963aa81","Type":"ContainerDied","Data":"6b84d22be0c066a3fb461c165de945835f542544b087a711ac12424e305f9915"} Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.998189 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b84d22be0c066a3fb461c165de945835f542544b087a711ac12424e305f9915" Dec 01 09:53:17 crc kubenswrapper[4867]: I1201 09:53:17.998211 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.154063 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6"] Dec 01 09:53:18 crc kubenswrapper[4867]: E1201 09:53:18.154569 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0cef16-29f1-49cf-aee1-7c5d9963aa81" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.154590 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0cef16-29f1-49cf-aee1-7c5d9963aa81" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.154873 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e0cef16-29f1-49cf-aee1-7c5d9963aa81" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.155647 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.157390 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.157634 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.157859 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.158063 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.158205 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.159435 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.159654 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.205576 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6"] Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.336687 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjqjt\" (UniqueName: \"kubernetes.io/projected/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-kube-api-access-rjqjt\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 
09:53:18.336981 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.337091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.337340 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.337859 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.337982 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.338003 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.338033 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.338091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.440142 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: 
\"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.440276 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.440316 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.440357 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.440420 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.440476 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rjqjt\" (UniqueName: \"kubernetes.io/projected/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-kube-api-access-rjqjt\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.440514 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.440589 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.440661 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.442207 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.446455 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.446526 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.447602 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.447719 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.448128 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.449703 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.449799 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.459615 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjqjt\" (UniqueName: \"kubernetes.io/projected/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-kube-api-access-rjqjt\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lmcp6\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:18 crc kubenswrapper[4867]: I1201 09:53:18.520418 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:53:19 crc kubenswrapper[4867]: I1201 09:53:19.047017 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:53:19 crc kubenswrapper[4867]: I1201 09:53:19.051666 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6"] Dec 01 09:53:20 crc kubenswrapper[4867]: I1201 09:53:20.019913 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" event={"ID":"25628db2-c71e-4e6e-bfa2-d753bfc7fb89","Type":"ContainerStarted","Data":"c19f31d142ff6525acd01752561d519afe3874c0ae276478fd06332e373a2292"} Dec 01 09:53:20 crc kubenswrapper[4867]: I1201 09:53:20.020298 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" event={"ID":"25628db2-c71e-4e6e-bfa2-d753bfc7fb89","Type":"ContainerStarted","Data":"c4a57fe053167ba2a3fe286efbc93d30ca97678d1c2310b4db294763ed08656d"} Dec 01 09:53:20 crc kubenswrapper[4867]: I1201 09:53:20.044563 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" podStartSLOduration=1.853381055 podStartE2EDuration="2.044541385s" podCreationTimestamp="2025-12-01 09:53:18 +0000 UTC" firstStartedPulling="2025-12-01 09:53:19.046830883 +0000 UTC m=+2720.506217637" lastFinishedPulling="2025-12-01 09:53:19.237991213 +0000 UTC m=+2720.697377967" observedRunningTime="2025-12-01 09:53:20.043543068 +0000 UTC m=+2721.502929842" watchObservedRunningTime="2025-12-01 09:53:20.044541385 +0000 UTC m=+2721.503928159" Dec 01 09:53:21 crc kubenswrapper[4867]: I1201 09:53:21.600992 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:53:21 crc kubenswrapper[4867]: I1201 09:53:21.601403 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:53:51 crc kubenswrapper[4867]: I1201 09:53:51.601351 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:53:51 crc kubenswrapper[4867]: I1201 09:53:51.602097 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:53:51 crc kubenswrapper[4867]: I1201 09:53:51.602148 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:53:51 crc kubenswrapper[4867]: I1201 09:53:51.602970 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dbc913c646b56f27ce8d54433b201384f1916f6fc4508d536119f230487c7618"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:53:51 crc kubenswrapper[4867]: I1201 09:53:51.603026 4867 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://dbc913c646b56f27ce8d54433b201384f1916f6fc4508d536119f230487c7618" gracePeriod=600 Dec 01 09:53:52 crc kubenswrapper[4867]: I1201 09:53:52.297411 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="dbc913c646b56f27ce8d54433b201384f1916f6fc4508d536119f230487c7618" exitCode=0 Dec 01 09:53:52 crc kubenswrapper[4867]: I1201 09:53:52.297485 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"dbc913c646b56f27ce8d54433b201384f1916f6fc4508d536119f230487c7618"} Dec 01 09:53:52 crc kubenswrapper[4867]: I1201 09:53:52.298039 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8"} Dec 01 09:53:52 crc kubenswrapper[4867]: I1201 09:53:52.298062 4867 scope.go:117] "RemoveContainer" containerID="2e0d7b15bcd8570b9df7d47ac7530d4fe3806f8ab48fe72040e7bc013233a9cb" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.044543 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4p5sr"] Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.048105 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.058785 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4p5sr"] Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.183732 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9rwb\" (UniqueName: \"kubernetes.io/projected/d7e16372-4aa6-4464-8ed1-e05b395ef41e-kube-api-access-p9rwb\") pod \"certified-operators-4p5sr\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.183867 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-utilities\") pod \"certified-operators-4p5sr\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.183911 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-catalog-content\") pod \"certified-operators-4p5sr\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.286123 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-utilities\") pod \"certified-operators-4p5sr\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.286203 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-catalog-content\") pod \"certified-operators-4p5sr\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.286281 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9rwb\" (UniqueName: \"kubernetes.io/projected/d7e16372-4aa6-4464-8ed1-e05b395ef41e-kube-api-access-p9rwb\") pod \"certified-operators-4p5sr\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.286614 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-utilities\") pod \"certified-operators-4p5sr\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.286705 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-catalog-content\") pod \"certified-operators-4p5sr\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.308867 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9rwb\" (UniqueName: \"kubernetes.io/projected/d7e16372-4aa6-4464-8ed1-e05b395ef41e-kube-api-access-p9rwb\") pod \"certified-operators-4p5sr\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:15 crc kubenswrapper[4867]: I1201 09:54:15.369747 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:16 crc kubenswrapper[4867]: I1201 09:54:15.999142 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4p5sr"] Dec 01 09:54:16 crc kubenswrapper[4867]: I1201 09:54:16.585799 4867 generic.go:334] "Generic (PLEG): container finished" podID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerID="9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858" exitCode=0 Dec 01 09:54:16 crc kubenswrapper[4867]: I1201 09:54:16.585925 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p5sr" event={"ID":"d7e16372-4aa6-4464-8ed1-e05b395ef41e","Type":"ContainerDied","Data":"9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858"} Dec 01 09:54:16 crc kubenswrapper[4867]: I1201 09:54:16.586862 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p5sr" event={"ID":"d7e16372-4aa6-4464-8ed1-e05b395ef41e","Type":"ContainerStarted","Data":"1bcde2d00306561cbe3ad6878d9c7a9bed05459ba948327426a67533edffbabb"} Dec 01 09:54:17 crc kubenswrapper[4867]: I1201 09:54:17.598512 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p5sr" event={"ID":"d7e16372-4aa6-4464-8ed1-e05b395ef41e","Type":"ContainerStarted","Data":"72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462"} Dec 01 09:54:19 crc kubenswrapper[4867]: I1201 09:54:19.973245 4867 generic.go:334] "Generic (PLEG): container finished" podID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerID="72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462" exitCode=0 Dec 01 09:54:19 crc kubenswrapper[4867]: I1201 09:54:19.973351 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p5sr" 
event={"ID":"d7e16372-4aa6-4464-8ed1-e05b395ef41e","Type":"ContainerDied","Data":"72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462"} Dec 01 09:54:20 crc kubenswrapper[4867]: I1201 09:54:20.988517 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p5sr" event={"ID":"d7e16372-4aa6-4464-8ed1-e05b395ef41e","Type":"ContainerStarted","Data":"cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7"} Dec 01 09:54:21 crc kubenswrapper[4867]: I1201 09:54:21.014078 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4p5sr" podStartSLOduration=2.21556217 podStartE2EDuration="6.014057696s" podCreationTimestamp="2025-12-01 09:54:15 +0000 UTC" firstStartedPulling="2025-12-01 09:54:16.587926264 +0000 UTC m=+2778.047313018" lastFinishedPulling="2025-12-01 09:54:20.3864218 +0000 UTC m=+2781.845808544" observedRunningTime="2025-12-01 09:54:21.011438165 +0000 UTC m=+2782.470824929" watchObservedRunningTime="2025-12-01 09:54:21.014057696 +0000 UTC m=+2782.473444450" Dec 01 09:54:25 crc kubenswrapper[4867]: I1201 09:54:25.370293 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:25 crc kubenswrapper[4867]: I1201 09:54:25.370725 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:25 crc kubenswrapper[4867]: I1201 09:54:25.428135 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:26 crc kubenswrapper[4867]: I1201 09:54:26.100618 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:26 crc kubenswrapper[4867]: I1201 09:54:26.156775 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-4p5sr"] Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.050952 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4p5sr" podUID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerName="registry-server" containerID="cri-o://cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7" gracePeriod=2 Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.528669 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.558904 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-catalog-content\") pod \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.558962 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9rwb\" (UniqueName: \"kubernetes.io/projected/d7e16372-4aa6-4464-8ed1-e05b395ef41e-kube-api-access-p9rwb\") pod \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.559034 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-utilities\") pod \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\" (UID: \"d7e16372-4aa6-4464-8ed1-e05b395ef41e\") " Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.559572 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-utilities" (OuterVolumeSpecName: "utilities") pod "d7e16372-4aa6-4464-8ed1-e05b395ef41e" (UID: 
"d7e16372-4aa6-4464-8ed1-e05b395ef41e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.559995 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.567069 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e16372-4aa6-4464-8ed1-e05b395ef41e-kube-api-access-p9rwb" (OuterVolumeSpecName: "kube-api-access-p9rwb") pod "d7e16372-4aa6-4464-8ed1-e05b395ef41e" (UID: "d7e16372-4aa6-4464-8ed1-e05b395ef41e"). InnerVolumeSpecName "kube-api-access-p9rwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.621342 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7e16372-4aa6-4464-8ed1-e05b395ef41e" (UID: "d7e16372-4aa6-4464-8ed1-e05b395ef41e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.662105 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e16372-4aa6-4464-8ed1-e05b395ef41e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:28 crc kubenswrapper[4867]: I1201 09:54:28.662146 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9rwb\" (UniqueName: \"kubernetes.io/projected/d7e16372-4aa6-4464-8ed1-e05b395ef41e-kube-api-access-p9rwb\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:29 crc kubenswrapper[4867]: E1201 09:54:29.008347 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e16372_4aa6_4464_8ed1_e05b395ef41e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e16372_4aa6_4464_8ed1_e05b395ef41e.slice/crio-1bcde2d00306561cbe3ad6878d9c7a9bed05459ba948327426a67533edffbabb\": RecentStats: unable to find data in memory cache]" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.064730 4867 generic.go:334] "Generic (PLEG): container finished" podID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerID="cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7" exitCode=0 Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.064858 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p5sr" event={"ID":"d7e16372-4aa6-4464-8ed1-e05b395ef41e","Type":"ContainerDied","Data":"cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7"} Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.065078 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p5sr" 
event={"ID":"d7e16372-4aa6-4464-8ed1-e05b395ef41e","Type":"ContainerDied","Data":"1bcde2d00306561cbe3ad6878d9c7a9bed05459ba948327426a67533edffbabb"} Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.065096 4867 scope.go:117] "RemoveContainer" containerID="cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.064914 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4p5sr" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.091300 4867 scope.go:117] "RemoveContainer" containerID="72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.111687 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4p5sr"] Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.121209 4867 scope.go:117] "RemoveContainer" containerID="9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.129536 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4p5sr"] Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.187869 4867 scope.go:117] "RemoveContainer" containerID="cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7" Dec 01 09:54:29 crc kubenswrapper[4867]: E1201 09:54:29.189434 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7\": container with ID starting with cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7 not found: ID does not exist" containerID="cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.189474 4867 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7"} err="failed to get container status \"cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7\": rpc error: code = NotFound desc = could not find container \"cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7\": container with ID starting with cff53bce59477d67e667e29cf9e490856ef2f27db4d0ed6a11d4f14d5e1e1aa7 not found: ID does not exist" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.189503 4867 scope.go:117] "RemoveContainer" containerID="72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462" Dec 01 09:54:29 crc kubenswrapper[4867]: E1201 09:54:29.190079 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462\": container with ID starting with 72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462 not found: ID does not exist" containerID="72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.190110 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462"} err="failed to get container status \"72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462\": rpc error: code = NotFound desc = could not find container \"72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462\": container with ID starting with 72b0378bc2f8b0d73b43c83cf4eb977a5a4f302db7d56d03deeb2eab0817a462 not found: ID does not exist" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.190126 4867 scope.go:117] "RemoveContainer" containerID="9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858" Dec 01 09:54:29 crc kubenswrapper[4867]: E1201 09:54:29.190457 4867 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858\": container with ID starting with 9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858 not found: ID does not exist" containerID="9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858" Dec 01 09:54:29 crc kubenswrapper[4867]: I1201 09:54:29.190478 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858"} err="failed to get container status \"9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858\": rpc error: code = NotFound desc = could not find container \"9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858\": container with ID starting with 9cde130690ad702a0bf054efcf2aa5345b9d8aa65c9c77318cc1bfb37aaff858 not found: ID does not exist" Dec 01 09:54:30 crc kubenswrapper[4867]: I1201 09:54:30.838429 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" path="/var/lib/kubelet/pods/d7e16372-4aa6-4464-8ed1-e05b395ef41e/volumes" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.526798 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j9fj5"] Dec 01 09:54:34 crc kubenswrapper[4867]: E1201 09:54:34.528022 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerName="extract-utilities" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.528052 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerName="extract-utilities" Dec 01 09:54:34 crc kubenswrapper[4867]: E1201 09:54:34.528068 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerName="extract-content" Dec 01 09:54:34 crc 
kubenswrapper[4867]: I1201 09:54:34.528076 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerName="extract-content" Dec 01 09:54:34 crc kubenswrapper[4867]: E1201 09:54:34.528094 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerName="registry-server" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.528100 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerName="registry-server" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.528446 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e16372-4aa6-4464-8ed1-e05b395ef41e" containerName="registry-server" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.530207 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.537043 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9fj5"] Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.575720 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7mgn\" (UniqueName: \"kubernetes.io/projected/81630672-d4bd-4278-932a-e3799c2c6160-kube-api-access-m7mgn\") pod \"redhat-marketplace-j9fj5\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.575780 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-utilities\") pod \"redhat-marketplace-j9fj5\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc 
kubenswrapper[4867]: I1201 09:54:34.575991 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-catalog-content\") pod \"redhat-marketplace-j9fj5\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.678600 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7mgn\" (UniqueName: \"kubernetes.io/projected/81630672-d4bd-4278-932a-e3799c2c6160-kube-api-access-m7mgn\") pod \"redhat-marketplace-j9fj5\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.678681 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-utilities\") pod \"redhat-marketplace-j9fj5\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.678731 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-catalog-content\") pod \"redhat-marketplace-j9fj5\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.679238 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-catalog-content\") pod \"redhat-marketplace-j9fj5\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc 
kubenswrapper[4867]: I1201 09:54:34.679303 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-utilities\") pod \"redhat-marketplace-j9fj5\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.700801 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7mgn\" (UniqueName: \"kubernetes.io/projected/81630672-d4bd-4278-932a-e3799c2c6160-kube-api-access-m7mgn\") pod \"redhat-marketplace-j9fj5\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:34 crc kubenswrapper[4867]: I1201 09:54:34.864609 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:35 crc kubenswrapper[4867]: I1201 09:54:35.461176 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9fj5"] Dec 01 09:54:35 crc kubenswrapper[4867]: W1201 09:54:35.472482 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81630672_d4bd_4278_932a_e3799c2c6160.slice/crio-20eab8e784880d27ab8afe222f7c44085c26de67848502219b9985bc296fd364 WatchSource:0}: Error finding container 20eab8e784880d27ab8afe222f7c44085c26de67848502219b9985bc296fd364: Status 404 returned error can't find the container with id 20eab8e784880d27ab8afe222f7c44085c26de67848502219b9985bc296fd364 Dec 01 09:54:36 crc kubenswrapper[4867]: I1201 09:54:36.127339 4867 generic.go:334] "Generic (PLEG): container finished" podID="81630672-d4bd-4278-932a-e3799c2c6160" containerID="87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c" exitCode=0 Dec 01 09:54:36 crc kubenswrapper[4867]: I1201 09:54:36.127646 4867 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9fj5" event={"ID":"81630672-d4bd-4278-932a-e3799c2c6160","Type":"ContainerDied","Data":"87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c"} Dec 01 09:54:36 crc kubenswrapper[4867]: I1201 09:54:36.127682 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9fj5" event={"ID":"81630672-d4bd-4278-932a-e3799c2c6160","Type":"ContainerStarted","Data":"20eab8e784880d27ab8afe222f7c44085c26de67848502219b9985bc296fd364"} Dec 01 09:54:37 crc kubenswrapper[4867]: I1201 09:54:37.137228 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9fj5" event={"ID":"81630672-d4bd-4278-932a-e3799c2c6160","Type":"ContainerStarted","Data":"191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543"} Dec 01 09:54:38 crc kubenswrapper[4867]: I1201 09:54:38.147030 4867 generic.go:334] "Generic (PLEG): container finished" podID="81630672-d4bd-4278-932a-e3799c2c6160" containerID="191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543" exitCode=0 Dec 01 09:54:38 crc kubenswrapper[4867]: I1201 09:54:38.147069 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9fj5" event={"ID":"81630672-d4bd-4278-932a-e3799c2c6160","Type":"ContainerDied","Data":"191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543"} Dec 01 09:54:40 crc kubenswrapper[4867]: I1201 09:54:40.171991 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9fj5" event={"ID":"81630672-d4bd-4278-932a-e3799c2c6160","Type":"ContainerStarted","Data":"7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6"} Dec 01 09:54:44 crc kubenswrapper[4867]: I1201 09:54:44.864975 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:44 crc 
kubenswrapper[4867]: I1201 09:54:44.865405 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:44 crc kubenswrapper[4867]: I1201 09:54:44.919385 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:44 crc kubenswrapper[4867]: I1201 09:54:44.941684 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j9fj5" podStartSLOduration=7.961679243 podStartE2EDuration="10.941519855s" podCreationTimestamp="2025-12-01 09:54:34 +0000 UTC" firstStartedPulling="2025-12-01 09:54:36.12934119 +0000 UTC m=+2797.588727944" lastFinishedPulling="2025-12-01 09:54:39.109181792 +0000 UTC m=+2800.568568556" observedRunningTime="2025-12-01 09:54:40.198624079 +0000 UTC m=+2801.658010853" watchObservedRunningTime="2025-12-01 09:54:44.941519855 +0000 UTC m=+2806.400906609" Dec 01 09:54:45 crc kubenswrapper[4867]: I1201 09:54:45.263766 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:46 crc kubenswrapper[4867]: I1201 09:54:46.156514 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9fj5"] Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.229862 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j9fj5" podUID="81630672-d4bd-4278-932a-e3799c2c6160" containerName="registry-server" containerID="cri-o://7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6" gracePeriod=2 Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.698021 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.739897 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-utilities\") pod \"81630672-d4bd-4278-932a-e3799c2c6160\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.739947 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-catalog-content\") pod \"81630672-d4bd-4278-932a-e3799c2c6160\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.740162 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7mgn\" (UniqueName: \"kubernetes.io/projected/81630672-d4bd-4278-932a-e3799c2c6160-kube-api-access-m7mgn\") pod \"81630672-d4bd-4278-932a-e3799c2c6160\" (UID: \"81630672-d4bd-4278-932a-e3799c2c6160\") " Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.740710 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-utilities" (OuterVolumeSpecName: "utilities") pod "81630672-d4bd-4278-932a-e3799c2c6160" (UID: "81630672-d4bd-4278-932a-e3799c2c6160"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.740862 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.747025 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81630672-d4bd-4278-932a-e3799c2c6160-kube-api-access-m7mgn" (OuterVolumeSpecName: "kube-api-access-m7mgn") pod "81630672-d4bd-4278-932a-e3799c2c6160" (UID: "81630672-d4bd-4278-932a-e3799c2c6160"). InnerVolumeSpecName "kube-api-access-m7mgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.760791 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81630672-d4bd-4278-932a-e3799c2c6160" (UID: "81630672-d4bd-4278-932a-e3799c2c6160"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.842920 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81630672-d4bd-4278-932a-e3799c2c6160-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:47 crc kubenswrapper[4867]: I1201 09:54:47.842950 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7mgn\" (UniqueName: \"kubernetes.io/projected/81630672-d4bd-4278-932a-e3799c2c6160-kube-api-access-m7mgn\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.261450 4867 generic.go:334] "Generic (PLEG): container finished" podID="81630672-d4bd-4278-932a-e3799c2c6160" containerID="7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6" exitCode=0 Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.261543 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9fj5" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.261573 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9fj5" event={"ID":"81630672-d4bd-4278-932a-e3799c2c6160","Type":"ContainerDied","Data":"7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6"} Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.263071 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9fj5" event={"ID":"81630672-d4bd-4278-932a-e3799c2c6160","Type":"ContainerDied","Data":"20eab8e784880d27ab8afe222f7c44085c26de67848502219b9985bc296fd364"} Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.263120 4867 scope.go:117] "RemoveContainer" containerID="7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.299759 4867 scope.go:117] "RemoveContainer" 
containerID="191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.302639 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9fj5"] Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.316963 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9fj5"] Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.329243 4867 scope.go:117] "RemoveContainer" containerID="87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.389492 4867 scope.go:117] "RemoveContainer" containerID="7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6" Dec 01 09:54:48 crc kubenswrapper[4867]: E1201 09:54:48.392505 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6\": container with ID starting with 7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6 not found: ID does not exist" containerID="7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.392561 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6"} err="failed to get container status \"7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6\": rpc error: code = NotFound desc = could not find container \"7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6\": container with ID starting with 7f89776712043e3b79cb82cda7abb675a02f902eefdd36474a3dbbfa9be0f5c6 not found: ID does not exist" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.392592 4867 scope.go:117] "RemoveContainer" 
containerID="191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543" Dec 01 09:54:48 crc kubenswrapper[4867]: E1201 09:54:48.393157 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543\": container with ID starting with 191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543 not found: ID does not exist" containerID="191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.393181 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543"} err="failed to get container status \"191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543\": rpc error: code = NotFound desc = could not find container \"191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543\": container with ID starting with 191844a80d8a9c169c6a4fd6048064600773dd6b3c3136593bf81e21f559d543 not found: ID does not exist" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.393198 4867 scope.go:117] "RemoveContainer" containerID="87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c" Dec 01 09:54:48 crc kubenswrapper[4867]: E1201 09:54:48.393622 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c\": container with ID starting with 87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c not found: ID does not exist" containerID="87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.393640 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c"} err="failed to get container status \"87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c\": rpc error: code = NotFound desc = could not find container \"87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c\": container with ID starting with 87bcc12bdda859a3bd44f4334218c30741b921c8933d9694c795540057bda96c not found: ID does not exist" Dec 01 09:54:48 crc kubenswrapper[4867]: I1201 09:54:48.837443 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81630672-d4bd-4278-932a-e3799c2c6160" path="/var/lib/kubelet/pods/81630672-d4bd-4278-932a-e3799c2c6160/volumes" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.379729 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qw2c5"] Dec 01 09:55:06 crc kubenswrapper[4867]: E1201 09:55:06.380643 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81630672-d4bd-4278-932a-e3799c2c6160" containerName="extract-utilities" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.380656 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="81630672-d4bd-4278-932a-e3799c2c6160" containerName="extract-utilities" Dec 01 09:55:06 crc kubenswrapper[4867]: E1201 09:55:06.380698 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81630672-d4bd-4278-932a-e3799c2c6160" containerName="registry-server" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.380705 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="81630672-d4bd-4278-932a-e3799c2c6160" containerName="registry-server" Dec 01 09:55:06 crc kubenswrapper[4867]: E1201 09:55:06.380724 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81630672-d4bd-4278-932a-e3799c2c6160" containerName="extract-content" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.380732 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81630672-d4bd-4278-932a-e3799c2c6160" containerName="extract-content" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.380912 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="81630672-d4bd-4278-932a-e3799c2c6160" containerName="registry-server" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.382668 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.403516 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qw2c5"] Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.553486 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-catalog-content\") pod \"redhat-operators-qw2c5\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.553545 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j5lb\" (UniqueName: \"kubernetes.io/projected/a5c85ced-a46a-4b1c-b160-776670bd1ea9-kube-api-access-6j5lb\") pod \"redhat-operators-qw2c5\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.554059 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-utilities\") pod \"redhat-operators-qw2c5\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.656478 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6j5lb\" (UniqueName: \"kubernetes.io/projected/a5c85ced-a46a-4b1c-b160-776670bd1ea9-kube-api-access-6j5lb\") pod \"redhat-operators-qw2c5\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.656676 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-utilities\") pod \"redhat-operators-qw2c5\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.656880 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-catalog-content\") pod \"redhat-operators-qw2c5\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.657229 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-catalog-content\") pod \"redhat-operators-qw2c5\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.657464 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-utilities\") pod \"redhat-operators-qw2c5\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.684782 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j5lb\" (UniqueName: 
\"kubernetes.io/projected/a5c85ced-a46a-4b1c-b160-776670bd1ea9-kube-api-access-6j5lb\") pod \"redhat-operators-qw2c5\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:06 crc kubenswrapper[4867]: I1201 09:55:06.720353 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:07 crc kubenswrapper[4867]: I1201 09:55:07.151677 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qw2c5"] Dec 01 09:55:07 crc kubenswrapper[4867]: I1201 09:55:07.441376 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw2c5" event={"ID":"a5c85ced-a46a-4b1c-b160-776670bd1ea9","Type":"ContainerStarted","Data":"a783a366fde8391d9b222134802677d4ecfa18fd4c5365fc0ab4e8ebe3d46b3a"} Dec 01 09:55:08 crc kubenswrapper[4867]: I1201 09:55:08.453665 4867 generic.go:334] "Generic (PLEG): container finished" podID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerID="b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d" exitCode=0 Dec 01 09:55:08 crc kubenswrapper[4867]: I1201 09:55:08.453748 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw2c5" event={"ID":"a5c85ced-a46a-4b1c-b160-776670bd1ea9","Type":"ContainerDied","Data":"b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d"} Dec 01 09:55:11 crc kubenswrapper[4867]: I1201 09:55:11.484753 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw2c5" event={"ID":"a5c85ced-a46a-4b1c-b160-776670bd1ea9","Type":"ContainerStarted","Data":"652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3"} Dec 01 09:55:16 crc kubenswrapper[4867]: I1201 09:55:16.056046 4867 generic.go:334] "Generic (PLEG): container finished" podID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" 
containerID="652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3" exitCode=0 Dec 01 09:55:16 crc kubenswrapper[4867]: I1201 09:55:16.056127 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw2c5" event={"ID":"a5c85ced-a46a-4b1c-b160-776670bd1ea9","Type":"ContainerDied","Data":"652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3"} Dec 01 09:55:17 crc kubenswrapper[4867]: I1201 09:55:17.070847 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw2c5" event={"ID":"a5c85ced-a46a-4b1c-b160-776670bd1ea9","Type":"ContainerStarted","Data":"e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13"} Dec 01 09:55:17 crc kubenswrapper[4867]: I1201 09:55:17.101883 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qw2c5" podStartSLOduration=2.85427289 podStartE2EDuration="11.101859277s" podCreationTimestamp="2025-12-01 09:55:06 +0000 UTC" firstStartedPulling="2025-12-01 09:55:08.455509978 +0000 UTC m=+2829.914896732" lastFinishedPulling="2025-12-01 09:55:16.703096365 +0000 UTC m=+2838.162483119" observedRunningTime="2025-12-01 09:55:17.091167123 +0000 UTC m=+2838.550553887" watchObservedRunningTime="2025-12-01 09:55:17.101859277 +0000 UTC m=+2838.561246031" Dec 01 09:55:26 crc kubenswrapper[4867]: I1201 09:55:26.720827 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:26 crc kubenswrapper[4867]: I1201 09:55:26.721372 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:27 crc kubenswrapper[4867]: I1201 09:55:27.771920 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qw2c5" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerName="registry-server" 
probeResult="failure" output=< Dec 01 09:55:27 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 09:55:27 crc kubenswrapper[4867]: > Dec 01 09:55:36 crc kubenswrapper[4867]: I1201 09:55:36.772948 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:36 crc kubenswrapper[4867]: I1201 09:55:36.826142 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:37 crc kubenswrapper[4867]: I1201 09:55:37.582471 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qw2c5"] Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.246289 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qw2c5" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerName="registry-server" containerID="cri-o://e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13" gracePeriod=2 Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.656632 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.824514 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-utilities\") pod \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.824598 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j5lb\" (UniqueName: \"kubernetes.io/projected/a5c85ced-a46a-4b1c-b160-776670bd1ea9-kube-api-access-6j5lb\") pod \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.824799 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-catalog-content\") pod \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\" (UID: \"a5c85ced-a46a-4b1c-b160-776670bd1ea9\") " Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.827089 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-utilities" (OuterVolumeSpecName: "utilities") pod "a5c85ced-a46a-4b1c-b160-776670bd1ea9" (UID: "a5c85ced-a46a-4b1c-b160-776670bd1ea9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.833462 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c85ced-a46a-4b1c-b160-776670bd1ea9-kube-api-access-6j5lb" (OuterVolumeSpecName: "kube-api-access-6j5lb") pod "a5c85ced-a46a-4b1c-b160-776670bd1ea9" (UID: "a5c85ced-a46a-4b1c-b160-776670bd1ea9"). InnerVolumeSpecName "kube-api-access-6j5lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.924237 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5c85ced-a46a-4b1c-b160-776670bd1ea9" (UID: "a5c85ced-a46a-4b1c-b160-776670bd1ea9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.928073 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.928118 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c85ced-a46a-4b1c-b160-776670bd1ea9-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:38 crc kubenswrapper[4867]: I1201 09:55:38.928129 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j5lb\" (UniqueName: \"kubernetes.io/projected/a5c85ced-a46a-4b1c-b160-776670bd1ea9-kube-api-access-6j5lb\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.257351 4867 generic.go:334] "Generic (PLEG): container finished" podID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerID="e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13" exitCode=0 Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.257408 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw2c5" event={"ID":"a5c85ced-a46a-4b1c-b160-776670bd1ea9","Type":"ContainerDied","Data":"e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13"} Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.257435 4867 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qw2c5" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.257451 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw2c5" event={"ID":"a5c85ced-a46a-4b1c-b160-776670bd1ea9","Type":"ContainerDied","Data":"a783a366fde8391d9b222134802677d4ecfa18fd4c5365fc0ab4e8ebe3d46b3a"} Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.257474 4867 scope.go:117] "RemoveContainer" containerID="e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.296931 4867 scope.go:117] "RemoveContainer" containerID="652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.299396 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qw2c5"] Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.310905 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qw2c5"] Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.336038 4867 scope.go:117] "RemoveContainer" containerID="b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.386892 4867 scope.go:117] "RemoveContainer" containerID="e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13" Dec 01 09:55:39 crc kubenswrapper[4867]: E1201 09:55:39.387509 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13\": container with ID starting with e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13 not found: ID does not exist" containerID="e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.387576 4867 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13"} err="failed to get container status \"e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13\": rpc error: code = NotFound desc = could not find container \"e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13\": container with ID starting with e77eb03cf961af1de9b58f7d736bfa2b8d56cc055d0e43ef175202428727fc13 not found: ID does not exist" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.387605 4867 scope.go:117] "RemoveContainer" containerID="652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3" Dec 01 09:55:39 crc kubenswrapper[4867]: E1201 09:55:39.388622 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3\": container with ID starting with 652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3 not found: ID does not exist" containerID="652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.388760 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3"} err="failed to get container status \"652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3\": rpc error: code = NotFound desc = could not find container \"652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3\": container with ID starting with 652415acdf2c4b604d55b0ef078f344b0610f5a00920f2690ba9962699ae69c3 not found: ID does not exist" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.388936 4867 scope.go:117] "RemoveContainer" containerID="b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d" Dec 01 09:55:39 crc kubenswrapper[4867]: E1201 
09:55:39.389463 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d\": container with ID starting with b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d not found: ID does not exist" containerID="b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d" Dec 01 09:55:39 crc kubenswrapper[4867]: I1201 09:55:39.389524 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d"} err="failed to get container status \"b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d\": rpc error: code = NotFound desc = could not find container \"b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d\": container with ID starting with b71e1b2a22cfe61c06622c97e2e3003dcd2cdf8392537d8fe883023476362f3d not found: ID does not exist" Dec 01 09:55:40 crc kubenswrapper[4867]: I1201 09:55:40.838496 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" path="/var/lib/kubelet/pods/a5c85ced-a46a-4b1c-b160-776670bd1ea9/volumes" Dec 01 09:55:51 crc kubenswrapper[4867]: I1201 09:55:51.601775 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:55:51 crc kubenswrapper[4867]: I1201 09:55:51.602945 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 01 09:56:19 crc kubenswrapper[4867]: I1201 09:56:19.614335 4867 generic.go:334] "Generic (PLEG): container finished" podID="25628db2-c71e-4e6e-bfa2-d753bfc7fb89" containerID="c19f31d142ff6525acd01752561d519afe3874c0ae276478fd06332e373a2292" exitCode=0 Dec 01 09:56:19 crc kubenswrapper[4867]: I1201 09:56:19.614549 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" event={"ID":"25628db2-c71e-4e6e-bfa2-d753bfc7fb89","Type":"ContainerDied","Data":"c19f31d142ff6525acd01752561d519afe3874c0ae276478fd06332e373a2292"} Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.143741 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.194012 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-0\") pod \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.194102 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-inventory\") pod \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.194132 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-1\") pod \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.194280 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-0\") pod \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.194315 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-ssh-key\") pod \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.194348 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-combined-ca-bundle\") pod \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.194406 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-1\") pod \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.194439 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjqjt\" (UniqueName: \"kubernetes.io/projected/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-kube-api-access-rjqjt\") pod \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.194507 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-extra-config-0\") pod \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\" (UID: \"25628db2-c71e-4e6e-bfa2-d753bfc7fb89\") " Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.205697 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "25628db2-c71e-4e6e-bfa2-d753bfc7fb89" (UID: "25628db2-c71e-4e6e-bfa2-d753bfc7fb89"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.210657 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-kube-api-access-rjqjt" (OuterVolumeSpecName: "kube-api-access-rjqjt") pod "25628db2-c71e-4e6e-bfa2-d753bfc7fb89" (UID: "25628db2-c71e-4e6e-bfa2-d753bfc7fb89"). InnerVolumeSpecName "kube-api-access-rjqjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.226696 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "25628db2-c71e-4e6e-bfa2-d753bfc7fb89" (UID: "25628db2-c71e-4e6e-bfa2-d753bfc7fb89"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.228155 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "25628db2-c71e-4e6e-bfa2-d753bfc7fb89" (UID: "25628db2-c71e-4e6e-bfa2-d753bfc7fb89"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.231197 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "25628db2-c71e-4e6e-bfa2-d753bfc7fb89" (UID: "25628db2-c71e-4e6e-bfa2-d753bfc7fb89"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.231554 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "25628db2-c71e-4e6e-bfa2-d753bfc7fb89" (UID: "25628db2-c71e-4e6e-bfa2-d753bfc7fb89"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.237655 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "25628db2-c71e-4e6e-bfa2-d753bfc7fb89" (UID: "25628db2-c71e-4e6e-bfa2-d753bfc7fb89"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.242978 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "25628db2-c71e-4e6e-bfa2-d753bfc7fb89" (UID: "25628db2-c71e-4e6e-bfa2-d753bfc7fb89"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.252991 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-inventory" (OuterVolumeSpecName: "inventory") pod "25628db2-c71e-4e6e-bfa2-d753bfc7fb89" (UID: "25628db2-c71e-4e6e-bfa2-d753bfc7fb89"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.296781 4867 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.297202 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.297317 4867 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.297402 4867 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.297481 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjqjt\" (UniqueName: \"kubernetes.io/projected/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-kube-api-access-rjqjt\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.297572 4867 reconciler_common.go:293] "Volume detached for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.297726 4867 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.297977 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.298108 4867 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/25628db2-c71e-4e6e-bfa2-d753bfc7fb89-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.601036 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.601100 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.633918 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" 
event={"ID":"25628db2-c71e-4e6e-bfa2-d753bfc7fb89","Type":"ContainerDied","Data":"c4a57fe053167ba2a3fe286efbc93d30ca97678d1c2310b4db294763ed08656d"} Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.633970 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4a57fe053167ba2a3fe286efbc93d30ca97678d1c2310b4db294763ed08656d" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.633971 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lmcp6" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.772874 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6"] Dec 01 09:56:21 crc kubenswrapper[4867]: E1201 09:56:21.773400 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerName="registry-server" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.773425 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerName="registry-server" Dec 01 09:56:21 crc kubenswrapper[4867]: E1201 09:56:21.773470 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25628db2-c71e-4e6e-bfa2-d753bfc7fb89" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.773483 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="25628db2-c71e-4e6e-bfa2-d753bfc7fb89" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 09:56:21 crc kubenswrapper[4867]: E1201 09:56:21.773493 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerName="extract-utilities" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.773502 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" 
containerName="extract-utilities" Dec 01 09:56:21 crc kubenswrapper[4867]: E1201 09:56:21.773517 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerName="extract-content" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.773522 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerName="extract-content" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.773703 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="25628db2-c71e-4e6e-bfa2-d753bfc7fb89" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.773725 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c85ced-a46a-4b1c-b160-776670bd1ea9" containerName="registry-server" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.774423 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.779300 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.779341 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.779363 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zvcpg" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.779325 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.779910 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:56:21 crc 
kubenswrapper[4867]: I1201 09:56:21.843565 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6"] Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.909152 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.909477 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.909554 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.909600 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4hbk\" (UniqueName: \"kubernetes.io/projected/8a874825-a4d4-446d-b1fe-3317e3b67d55-kube-api-access-j4hbk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.909624 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.909854 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:21 crc kubenswrapper[4867]: I1201 09:56:21.909931 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.011305 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 
09:56:22.011412 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.011460 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4hbk\" (UniqueName: \"kubernetes.io/projected/8a874825-a4d4-446d-b1fe-3317e3b67d55-kube-api-access-j4hbk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.011492 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.011553 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.011583 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.011659 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.015534 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.016117 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.017730 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc 
kubenswrapper[4867]: I1201 09:56:22.018096 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.018428 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.019445 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.037978 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4hbk\" (UniqueName: \"kubernetes.io/projected/8a874825-a4d4-446d-b1fe-3317e3b67d55-kube-api-access-j4hbk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8df6\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.093006 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:56:22 crc kubenswrapper[4867]: I1201 09:56:22.630021 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6"] Dec 01 09:56:23 crc kubenswrapper[4867]: I1201 09:56:23.659767 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" event={"ID":"8a874825-a4d4-446d-b1fe-3317e3b67d55","Type":"ContainerStarted","Data":"71c314410504cf6030dd0ad473edbdf3df938f72e0891427e6983eb796129c28"} Dec 01 09:56:23 crc kubenswrapper[4867]: I1201 09:56:23.660392 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" event={"ID":"8a874825-a4d4-446d-b1fe-3317e3b67d55","Type":"ContainerStarted","Data":"8bca8f243742b293fa9b9f28735e1e2b1652c0de268b6caef81402c239d4f58c"} Dec 01 09:56:23 crc kubenswrapper[4867]: I1201 09:56:23.693318 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" podStartSLOduration=2.559586397 podStartE2EDuration="2.693288402s" podCreationTimestamp="2025-12-01 09:56:21 +0000 UTC" firstStartedPulling="2025-12-01 09:56:22.654100243 +0000 UTC m=+2904.113486997" lastFinishedPulling="2025-12-01 09:56:22.787802258 +0000 UTC m=+2904.247189002" observedRunningTime="2025-12-01 09:56:23.682047794 +0000 UTC m=+2905.141434548" watchObservedRunningTime="2025-12-01 09:56:23.693288402 +0000 UTC m=+2905.152675156" Dec 01 09:56:51 crc kubenswrapper[4867]: I1201 09:56:51.601260 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:56:51 crc kubenswrapper[4867]: 
I1201 09:56:51.601833 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:56:51 crc kubenswrapper[4867]: I1201 09:56:51.601893 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 09:56:51 crc kubenswrapper[4867]: I1201 09:56:51.602589 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:56:51 crc kubenswrapper[4867]: I1201 09:56:51.602646 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" gracePeriod=600 Dec 01 09:56:51 crc kubenswrapper[4867]: E1201 09:56:51.726827 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:56:51 crc kubenswrapper[4867]: I1201 09:56:51.943416 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" exitCode=0 Dec 01 09:56:51 crc kubenswrapper[4867]: I1201 09:56:51.943457 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8"} Dec 01 09:56:51 crc kubenswrapper[4867]: I1201 09:56:51.943486 4867 scope.go:117] "RemoveContainer" containerID="dbc913c646b56f27ce8d54433b201384f1916f6fc4508d536119f230487c7618" Dec 01 09:56:51 crc kubenswrapper[4867]: I1201 09:56:51.944071 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:56:51 crc kubenswrapper[4867]: E1201 09:56:51.944301 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.421391 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5vv58"] Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.423714 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.438583 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vv58"] Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.518499 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-catalog-content\") pod \"community-operators-5vv58\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.518573 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-utilities\") pod \"community-operators-5vv58\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.518635 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdc89\" (UniqueName: \"kubernetes.io/projected/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-kube-api-access-hdc89\") pod \"community-operators-5vv58\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.620121 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdc89\" (UniqueName: \"kubernetes.io/projected/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-kube-api-access-hdc89\") pod \"community-operators-5vv58\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.620307 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-catalog-content\") pod \"community-operators-5vv58\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.620382 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-utilities\") pod \"community-operators-5vv58\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.620929 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-catalog-content\") pod \"community-operators-5vv58\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.621001 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-utilities\") pod \"community-operators-5vv58\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.648915 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdc89\" (UniqueName: \"kubernetes.io/projected/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-kube-api-access-hdc89\") pod \"community-operators-5vv58\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:53 crc kubenswrapper[4867]: I1201 09:56:53.743060 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:56:54 crc kubenswrapper[4867]: I1201 09:56:54.325322 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vv58"] Dec 01 09:56:54 crc kubenswrapper[4867]: I1201 09:56:54.989718 4867 generic.go:334] "Generic (PLEG): container finished" podID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerID="82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6" exitCode=0 Dec 01 09:56:54 crc kubenswrapper[4867]: I1201 09:56:54.989833 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vv58" event={"ID":"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85","Type":"ContainerDied","Data":"82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6"} Dec 01 09:56:54 crc kubenswrapper[4867]: I1201 09:56:54.990116 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vv58" event={"ID":"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85","Type":"ContainerStarted","Data":"b7eb9a3111695b3f9a32a765f59f9f0269b917235da57d0f2b986b52dd4b189c"} Dec 01 09:56:57 crc kubenswrapper[4867]: I1201 09:56:57.010108 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vv58" event={"ID":"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85","Type":"ContainerStarted","Data":"2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f"} Dec 01 09:56:58 crc kubenswrapper[4867]: I1201 09:56:58.020240 4867 generic.go:334] "Generic (PLEG): container finished" podID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerID="2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f" exitCode=0 Dec 01 09:56:58 crc kubenswrapper[4867]: I1201 09:56:58.020454 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vv58" 
event={"ID":"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85","Type":"ContainerDied","Data":"2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f"} Dec 01 09:56:59 crc kubenswrapper[4867]: I1201 09:56:59.035300 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vv58" event={"ID":"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85","Type":"ContainerStarted","Data":"30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675"} Dec 01 09:56:59 crc kubenswrapper[4867]: I1201 09:56:59.059709 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5vv58" podStartSLOduration=2.475616438 podStartE2EDuration="6.059687732s" podCreationTimestamp="2025-12-01 09:56:53 +0000 UTC" firstStartedPulling="2025-12-01 09:56:54.992786317 +0000 UTC m=+2936.452173071" lastFinishedPulling="2025-12-01 09:56:58.576857621 +0000 UTC m=+2940.036244365" observedRunningTime="2025-12-01 09:56:59.057218204 +0000 UTC m=+2940.516604968" watchObservedRunningTime="2025-12-01 09:56:59.059687732 +0000 UTC m=+2940.519074486" Dec 01 09:57:03 crc kubenswrapper[4867]: I1201 09:57:03.743182 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:57:03 crc kubenswrapper[4867]: I1201 09:57:03.743766 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:57:03 crc kubenswrapper[4867]: I1201 09:57:03.805563 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:57:04 crc kubenswrapper[4867]: I1201 09:57:04.127968 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:57:04 crc kubenswrapper[4867]: I1201 09:57:04.195533 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-5vv58"] Dec 01 09:57:06 crc kubenswrapper[4867]: I1201 09:57:06.097968 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5vv58" podUID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerName="registry-server" containerID="cri-o://30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675" gracePeriod=2 Dec 01 09:57:06 crc kubenswrapper[4867]: I1201 09:57:06.836698 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:57:06 crc kubenswrapper[4867]: E1201 09:57:06.837258 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.089250 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.108103 4867 generic.go:334] "Generic (PLEG): container finished" podID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerID="30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675" exitCode=0 Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.108151 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vv58" event={"ID":"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85","Type":"ContainerDied","Data":"30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675"} Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.108182 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vv58" event={"ID":"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85","Type":"ContainerDied","Data":"b7eb9a3111695b3f9a32a765f59f9f0269b917235da57d0f2b986b52dd4b189c"} Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.108204 4867 scope.go:117] "RemoveContainer" containerID="30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.108253 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vv58" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.141078 4867 scope.go:117] "RemoveContainer" containerID="2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.184067 4867 scope.go:117] "RemoveContainer" containerID="82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.199787 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-utilities\") pod \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.199883 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdc89\" (UniqueName: \"kubernetes.io/projected/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-kube-api-access-hdc89\") pod \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.200061 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-catalog-content\") pod \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\" (UID: \"3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85\") " Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.201864 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-utilities" (OuterVolumeSpecName: "utilities") pod "3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" (UID: "3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.206852 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-kube-api-access-hdc89" (OuterVolumeSpecName: "kube-api-access-hdc89") pod "3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" (UID: "3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85"). InnerVolumeSpecName "kube-api-access-hdc89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.235128 4867 scope.go:117] "RemoveContainer" containerID="30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675" Dec 01 09:57:07 crc kubenswrapper[4867]: E1201 09:57:07.236315 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675\": container with ID starting with 30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675 not found: ID does not exist" containerID="30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.236361 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675"} err="failed to get container status \"30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675\": rpc error: code = NotFound desc = could not find container \"30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675\": container with ID starting with 30fc8c7f2c3d552dd64bee670b9aa03dd6d8aaf276dd5982baf9a1e61a0c9675 not found: ID does not exist" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.236389 4867 scope.go:117] "RemoveContainer" containerID="2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f" Dec 01 09:57:07 crc kubenswrapper[4867]: E1201 09:57:07.237008 
4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f\": container with ID starting with 2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f not found: ID does not exist" containerID="2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.237035 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f"} err="failed to get container status \"2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f\": rpc error: code = NotFound desc = could not find container \"2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f\": container with ID starting with 2a4f66a85a349e2297061261cf9569aadc0e610e4d937812765468c664dd2f3f not found: ID does not exist" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.237048 4867 scope.go:117] "RemoveContainer" containerID="82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6" Dec 01 09:57:07 crc kubenswrapper[4867]: E1201 09:57:07.237265 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6\": container with ID starting with 82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6 not found: ID does not exist" containerID="82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.237286 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6"} err="failed to get container status \"82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6\": rpc error: code = 
NotFound desc = could not find container \"82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6\": container with ID starting with 82821539d4fea034c56729a5949d9b73ad7a56c133e1c9acd889b8d34c0036f6 not found: ID does not exist" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.271772 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" (UID: "3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.302372 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.302410 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdc89\" (UniqueName: \"kubernetes.io/projected/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-kube-api-access-hdc89\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.302424 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.457934 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vv58"] Dec 01 09:57:07 crc kubenswrapper[4867]: I1201 09:57:07.469176 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5vv58"] Dec 01 09:57:08 crc kubenswrapper[4867]: I1201 09:57:08.840370 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" path="/var/lib/kubelet/pods/3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85/volumes" Dec 01 09:57:19 crc kubenswrapper[4867]: I1201 09:57:19.827537 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:57:19 crc kubenswrapper[4867]: E1201 09:57:19.828389 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:57:31 crc kubenswrapper[4867]: I1201 09:57:31.827860 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:57:31 crc kubenswrapper[4867]: E1201 09:57:31.829730 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:57:46 crc kubenswrapper[4867]: I1201 09:57:46.827336 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:57:46 crc kubenswrapper[4867]: E1201 09:57:46.828191 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:57:57 crc kubenswrapper[4867]: I1201 09:57:57.827700 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:57:57 crc kubenswrapper[4867]: E1201 09:57:57.828565 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:58:10 crc kubenswrapper[4867]: I1201 09:58:10.827291 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:58:10 crc kubenswrapper[4867]: E1201 09:58:10.828067 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:58:23 crc kubenswrapper[4867]: I1201 09:58:23.826581 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:58:23 crc kubenswrapper[4867]: E1201 09:58:23.827424 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:58:37 crc kubenswrapper[4867]: I1201 09:58:37.827120 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:58:37 crc kubenswrapper[4867]: E1201 09:58:37.827925 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:58:49 crc kubenswrapper[4867]: I1201 09:58:49.827294 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:58:49 crc kubenswrapper[4867]: E1201 09:58:49.828016 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:59:02 crc kubenswrapper[4867]: I1201 09:59:02.827320 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:59:02 crc kubenswrapper[4867]: E1201 09:59:02.828157 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:59:16 crc kubenswrapper[4867]: I1201 09:59:16.828041 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:59:16 crc kubenswrapper[4867]: E1201 09:59:16.828842 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:59:28 crc kubenswrapper[4867]: I1201 09:59:28.836704 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:59:28 crc kubenswrapper[4867]: E1201 09:59:28.838960 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:59:41 crc kubenswrapper[4867]: I1201 09:59:41.429308 4867 generic.go:334] "Generic (PLEG): container finished" podID="8a874825-a4d4-446d-b1fe-3317e3b67d55" containerID="71c314410504cf6030dd0ad473edbdf3df938f72e0891427e6983eb796129c28" exitCode=0 Dec 01 09:59:41 crc kubenswrapper[4867]: I1201 09:59:41.429382 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" event={"ID":"8a874825-a4d4-446d-b1fe-3317e3b67d55","Type":"ContainerDied","Data":"71c314410504cf6030dd0ad473edbdf3df938f72e0891427e6983eb796129c28"} Dec 01 09:59:41 crc kubenswrapper[4867]: I1201 09:59:41.827388 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:59:41 crc kubenswrapper[4867]: E1201 09:59:41.827768 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.048460 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.194362 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4hbk\" (UniqueName: \"kubernetes.io/projected/8a874825-a4d4-446d-b1fe-3317e3b67d55-kube-api-access-j4hbk\") pod \"8a874825-a4d4-446d-b1fe-3317e3b67d55\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.194706 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-0\") pod \"8a874825-a4d4-446d-b1fe-3317e3b67d55\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.194940 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ssh-key\") pod \"8a874825-a4d4-446d-b1fe-3317e3b67d55\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.194966 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-telemetry-combined-ca-bundle\") pod \"8a874825-a4d4-446d-b1fe-3317e3b67d55\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.195007 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-2\") pod \"8a874825-a4d4-446d-b1fe-3317e3b67d55\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 
09:59:43.195040 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-inventory\") pod \"8a874825-a4d4-446d-b1fe-3317e3b67d55\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.195084 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-1\") pod \"8a874825-a4d4-446d-b1fe-3317e3b67d55\" (UID: \"8a874825-a4d4-446d-b1fe-3317e3b67d55\") " Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.202123 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a874825-a4d4-446d-b1fe-3317e3b67d55-kube-api-access-j4hbk" (OuterVolumeSpecName: "kube-api-access-j4hbk") pod "8a874825-a4d4-446d-b1fe-3317e3b67d55" (UID: "8a874825-a4d4-446d-b1fe-3317e3b67d55"). InnerVolumeSpecName "kube-api-access-j4hbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.208158 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8a874825-a4d4-446d-b1fe-3317e3b67d55" (UID: "8a874825-a4d4-446d-b1fe-3317e3b67d55"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.232615 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "8a874825-a4d4-446d-b1fe-3317e3b67d55" (UID: "8a874825-a4d4-446d-b1fe-3317e3b67d55"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.233020 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8a874825-a4d4-446d-b1fe-3317e3b67d55" (UID: "8a874825-a4d4-446d-b1fe-3317e3b67d55"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.242710 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "8a874825-a4d4-446d-b1fe-3317e3b67d55" (UID: "8a874825-a4d4-446d-b1fe-3317e3b67d55"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.277994 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-inventory" (OuterVolumeSpecName: "inventory") pod "8a874825-a4d4-446d-b1fe-3317e3b67d55" (UID: "8a874825-a4d4-446d-b1fe-3317e3b67d55"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.279864 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "8a874825-a4d4-446d-b1fe-3317e3b67d55" (UID: "8a874825-a4d4-446d-b1fe-3317e3b67d55"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.297262 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.297504 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.297567 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4hbk\" (UniqueName: \"kubernetes.io/projected/8a874825-a4d4-446d-b1fe-3317e3b67d55-kube-api-access-j4hbk\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.297620 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.297890 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.297946 
4867 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.297997 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a874825-a4d4-446d-b1fe-3317e3b67d55-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.451527 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" event={"ID":"8a874825-a4d4-446d-b1fe-3317e3b67d55","Type":"ContainerDied","Data":"8bca8f243742b293fa9b9f28735e1e2b1652c0de268b6caef81402c239d4f58c"} Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.451572 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bca8f243742b293fa9b9f28735e1e2b1652c0de268b6caef81402c239d4f58c" Dec 01 09:59:43 crc kubenswrapper[4867]: I1201 09:59:43.451594 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8df6" Dec 01 09:59:54 crc kubenswrapper[4867]: I1201 09:59:54.832020 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 09:59:54 crc kubenswrapper[4867]: E1201 09:59:54.833373 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.151097 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b"] Dec 01 10:00:00 crc kubenswrapper[4867]: E1201 10:00:00.152014 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a874825-a4d4-446d-b1fe-3317e3b67d55" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.152027 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a874825-a4d4-446d-b1fe-3317e3b67d55" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 01 10:00:00 crc kubenswrapper[4867]: E1201 10:00:00.152060 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerName="extract-utilities" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.152065 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerName="extract-utilities" Dec 01 10:00:00 crc kubenswrapper[4867]: E1201 10:00:00.152075 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" 
containerName="extract-content" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.152082 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerName="extract-content" Dec 01 10:00:00 crc kubenswrapper[4867]: E1201 10:00:00.152100 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerName="registry-server" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.152106 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerName="registry-server" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.152281 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c97be5d-8c64-4a3d-ac1d-48f8b36e8b85" containerName="registry-server" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.152294 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a874825-a4d4-446d-b1fe-3317e3b67d55" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.152921 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.154754 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.155137 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.180827 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b"] Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.251411 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b13efbea-f333-4f57-8c7a-814104fcd7f5-secret-volume\") pod \"collect-profiles-29409720-qxf5b\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.251576 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b13efbea-f333-4f57-8c7a-814104fcd7f5-config-volume\") pod \"collect-profiles-29409720-qxf5b\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.251632 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qr6p\" (UniqueName: \"kubernetes.io/projected/b13efbea-f333-4f57-8c7a-814104fcd7f5-kube-api-access-2qr6p\") pod \"collect-profiles-29409720-qxf5b\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.353323 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b13efbea-f333-4f57-8c7a-814104fcd7f5-config-volume\") pod \"collect-profiles-29409720-qxf5b\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.353397 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qr6p\" (UniqueName: \"kubernetes.io/projected/b13efbea-f333-4f57-8c7a-814104fcd7f5-kube-api-access-2qr6p\") pod \"collect-profiles-29409720-qxf5b\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.353492 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b13efbea-f333-4f57-8c7a-814104fcd7f5-secret-volume\") pod \"collect-profiles-29409720-qxf5b\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.354333 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b13efbea-f333-4f57-8c7a-814104fcd7f5-config-volume\") pod \"collect-profiles-29409720-qxf5b\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.369542 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b13efbea-f333-4f57-8c7a-814104fcd7f5-secret-volume\") pod \"collect-profiles-29409720-qxf5b\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.376442 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qr6p\" (UniqueName: \"kubernetes.io/projected/b13efbea-f333-4f57-8c7a-814104fcd7f5-kube-api-access-2qr6p\") pod \"collect-profiles-29409720-qxf5b\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.483493 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:00 crc kubenswrapper[4867]: I1201 10:00:00.974570 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b"] Dec 01 10:00:01 crc kubenswrapper[4867]: I1201 10:00:01.612261 4867 generic.go:334] "Generic (PLEG): container finished" podID="b13efbea-f333-4f57-8c7a-814104fcd7f5" containerID="ace37c939462124d9bbd7f93c6ae765c3739ca31aa3e77629716d619e5f31f4d" exitCode=0 Dec 01 10:00:01 crc kubenswrapper[4867]: I1201 10:00:01.612307 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" event={"ID":"b13efbea-f333-4f57-8c7a-814104fcd7f5","Type":"ContainerDied","Data":"ace37c939462124d9bbd7f93c6ae765c3739ca31aa3e77629716d619e5f31f4d"} Dec 01 10:00:01 crc kubenswrapper[4867]: I1201 10:00:01.612336 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" 
event={"ID":"b13efbea-f333-4f57-8c7a-814104fcd7f5","Type":"ContainerStarted","Data":"1835aef097aa6c888fb48d9699e58d536eee8aedac6994164432e3f67c24f810"} Dec 01 10:00:02 crc kubenswrapper[4867]: I1201 10:00:02.946474 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.103385 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b13efbea-f333-4f57-8c7a-814104fcd7f5-secret-volume\") pod \"b13efbea-f333-4f57-8c7a-814104fcd7f5\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.103720 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qr6p\" (UniqueName: \"kubernetes.io/projected/b13efbea-f333-4f57-8c7a-814104fcd7f5-kube-api-access-2qr6p\") pod \"b13efbea-f333-4f57-8c7a-814104fcd7f5\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.103753 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b13efbea-f333-4f57-8c7a-814104fcd7f5-config-volume\") pod \"b13efbea-f333-4f57-8c7a-814104fcd7f5\" (UID: \"b13efbea-f333-4f57-8c7a-814104fcd7f5\") " Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.106410 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13efbea-f333-4f57-8c7a-814104fcd7f5-config-volume" (OuterVolumeSpecName: "config-volume") pod "b13efbea-f333-4f57-8c7a-814104fcd7f5" (UID: "b13efbea-f333-4f57-8c7a-814104fcd7f5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.114048 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13efbea-f333-4f57-8c7a-814104fcd7f5-kube-api-access-2qr6p" (OuterVolumeSpecName: "kube-api-access-2qr6p") pod "b13efbea-f333-4f57-8c7a-814104fcd7f5" (UID: "b13efbea-f333-4f57-8c7a-814104fcd7f5"). InnerVolumeSpecName "kube-api-access-2qr6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.115992 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b13efbea-f333-4f57-8c7a-814104fcd7f5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b13efbea-f333-4f57-8c7a-814104fcd7f5" (UID: "b13efbea-f333-4f57-8c7a-814104fcd7f5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.206286 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b13efbea-f333-4f57-8c7a-814104fcd7f5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.206337 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qr6p\" (UniqueName: \"kubernetes.io/projected/b13efbea-f333-4f57-8c7a-814104fcd7f5-kube-api-access-2qr6p\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.206354 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b13efbea-f333-4f57-8c7a-814104fcd7f5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.630592 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" 
event={"ID":"b13efbea-f333-4f57-8c7a-814104fcd7f5","Type":"ContainerDied","Data":"1835aef097aa6c888fb48d9699e58d536eee8aedac6994164432e3f67c24f810"} Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.630634 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1835aef097aa6c888fb48d9699e58d536eee8aedac6994164432e3f67c24f810" Dec 01 10:00:03 crc kubenswrapper[4867]: I1201 10:00:03.630648 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b" Dec 01 10:00:04 crc kubenswrapper[4867]: I1201 10:00:04.026866 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz"] Dec 01 10:00:04 crc kubenswrapper[4867]: I1201 10:00:04.034552 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-r5lsz"] Dec 01 10:00:04 crc kubenswrapper[4867]: I1201 10:00:04.843873 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658aa664-9092-421c-ab73-d7a75baff7f4" path="/var/lib/kubelet/pods/658aa664-9092-421c-ab73-d7a75baff7f4/volumes" Dec 01 10:00:08 crc kubenswrapper[4867]: I1201 10:00:08.835376 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:00:08 crc kubenswrapper[4867]: E1201 10:00:08.836113 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:00:20 crc kubenswrapper[4867]: I1201 10:00:20.831968 4867 scope.go:117] "RemoveContainer" 
containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:00:20 crc kubenswrapper[4867]: E1201 10:00:20.832686 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:00:21 crc kubenswrapper[4867]: I1201 10:00:21.795682 4867 scope.go:117] "RemoveContainer" containerID="dc4f853ce1ec07b75ab1469fad3535e1a1f1f34b42b8e2485f306b5224b396f1" Dec 01 10:00:31 crc kubenswrapper[4867]: I1201 10:00:31.826350 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:00:31 crc kubenswrapper[4867]: E1201 10:00:31.826972 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:00:43 crc kubenswrapper[4867]: I1201 10:00:43.827785 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:00:43 crc kubenswrapper[4867]: E1201 10:00:43.828833 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.931649 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 10:00:48 crc kubenswrapper[4867]: E1201 10:00:48.933011 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13efbea-f333-4f57-8c7a-814104fcd7f5" containerName="collect-profiles" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.933041 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13efbea-f333-4f57-8c7a-814104fcd7f5" containerName="collect-profiles" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.933381 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b13efbea-f333-4f57-8c7a-814104fcd7f5" containerName="collect-profiles" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.934364 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.938010 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.938095 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.938168 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.938595 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-77hwz" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.945582 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.950254 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.950298 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-config-data\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:48 crc kubenswrapper[4867]: I1201 10:00:48.950373 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.052154 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.052768 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-config-data\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.052988 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.053157 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.053369 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.053520 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9vj\" (UniqueName: \"kubernetes.io/projected/31b3d747-c383-483d-8919-be1dd3a266b6-kube-api-access-xt9vj\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.053642 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.053765 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.053938 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.054025 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.054602 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-config-data\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.069143 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.157232 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.158479 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.158518 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt9vj\" (UniqueName: \"kubernetes.io/projected/31b3d747-c383-483d-8919-be1dd3a266b6-kube-api-access-xt9vj\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.158550 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.158579 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.158602 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ssh-key\") pod 
\"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.158891 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.158995 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.159007 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.174849 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.174976 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " 
pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.178913 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt9vj\" (UniqueName: \"kubernetes.io/projected/31b3d747-c383-483d-8919-be1dd3a266b6-kube-api-access-xt9vj\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.195725 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.298127 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.763617 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 10:00:49 crc kubenswrapper[4867]: I1201 10:00:49.766449 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:00:50 crc kubenswrapper[4867]: I1201 10:00:50.134638 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"31b3d747-c383-483d-8919-be1dd3a266b6","Type":"ContainerStarted","Data":"bb545976072e04bbb374e034523593a4164c975cbe7afcf61c472ba7282403d7"} Dec 01 10:00:58 crc kubenswrapper[4867]: I1201 10:00:58.835751 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:00:58 crc kubenswrapper[4867]: E1201 10:00:58.836768 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.138533 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29409721-k9xqv"] Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.140563 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.161049 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409721-k9xqv"] Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.285257 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-combined-ca-bundle\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.285332 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-config-data\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.285381 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-fernet-keys\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc 
kubenswrapper[4867]: I1201 10:01:00.285597 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g52tw\" (UniqueName: \"kubernetes.io/projected/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-kube-api-access-g52tw\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.387727 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g52tw\" (UniqueName: \"kubernetes.io/projected/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-kube-api-access-g52tw\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.387937 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-combined-ca-bundle\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.388818 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-config-data\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.388893 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-fernet-keys\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: 
I1201 10:01:00.468999 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g52tw\" (UniqueName: \"kubernetes.io/projected/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-kube-api-access-g52tw\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.470702 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-fernet-keys\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.474894 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-combined-ca-bundle\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.486575 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-config-data\") pod \"keystone-cron-29409721-k9xqv\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:00 crc kubenswrapper[4867]: I1201 10:01:00.765299 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:01 crc kubenswrapper[4867]: I1201 10:01:01.446252 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409721-k9xqv"] Dec 01 10:01:02 crc kubenswrapper[4867]: I1201 10:01:02.247063 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-k9xqv" event={"ID":"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef","Type":"ContainerStarted","Data":"ecd8177475022ceeeb1dc5cc64155511bf0ce91a08f92eb2b83e308237679ee9"} Dec 01 10:01:02 crc kubenswrapper[4867]: I1201 10:01:02.247609 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-k9xqv" event={"ID":"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef","Type":"ContainerStarted","Data":"7dffd58b7933423204a0bb7e4d8a4166729a6b734adc013fdfe987967abb60f5"} Dec 01 10:01:02 crc kubenswrapper[4867]: I1201 10:01:02.263435 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29409721-k9xqv" podStartSLOduration=2.263413636 podStartE2EDuration="2.263413636s" podCreationTimestamp="2025-12-01 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:02.26280236 +0000 UTC m=+3183.722189134" watchObservedRunningTime="2025-12-01 10:01:02.263413636 +0000 UTC m=+3183.722800390" Dec 01 10:01:05 crc kubenswrapper[4867]: I1201 10:01:05.276639 4867 generic.go:334] "Generic (PLEG): container finished" podID="9a2a65a7-bbfb-40ee-bfe2-f99d1173daef" containerID="ecd8177475022ceeeb1dc5cc64155511bf0ce91a08f92eb2b83e308237679ee9" exitCode=0 Dec 01 10:01:05 crc kubenswrapper[4867]: I1201 10:01:05.276725 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-k9xqv" 
event={"ID":"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef","Type":"ContainerDied","Data":"ecd8177475022ceeeb1dc5cc64155511bf0ce91a08f92eb2b83e308237679ee9"} Dec 01 10:01:11 crc kubenswrapper[4867]: I1201 10:01:11.827679 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:01:11 crc kubenswrapper[4867]: E1201 10:01:11.828511 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:01:25 crc kubenswrapper[4867]: I1201 10:01:25.827402 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:01:25 crc kubenswrapper[4867]: E1201 10:01:25.829653 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.521210 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-k9xqv" event={"ID":"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef","Type":"ContainerDied","Data":"7dffd58b7933423204a0bb7e4d8a4166729a6b734adc013fdfe987967abb60f5"} Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.521629 4867 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7dffd58b7933423204a0bb7e4d8a4166729a6b734adc013fdfe987967abb60f5" Dec 01 10:01:30 crc kubenswrapper[4867]: E1201 10:01:30.535732 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 01 10:01:30 crc kubenswrapper[4867]: E1201 10:01:30.537216 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trus
t/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xt9vj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(31b3d747-c383-483d-8919-be1dd3a266b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:01:30 crc kubenswrapper[4867]: E1201 10:01:30.538791 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="31b3d747-c383-483d-8919-be1dd3a266b6" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.568261 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.665645 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g52tw\" (UniqueName: \"kubernetes.io/projected/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-kube-api-access-g52tw\") pod \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.665772 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-config-data\") pod \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.665803 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-fernet-keys\") pod \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.665911 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-combined-ca-bundle\") pod \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\" (UID: \"9a2a65a7-bbfb-40ee-bfe2-f99d1173daef\") " Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.673318 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9a2a65a7-bbfb-40ee-bfe2-f99d1173daef" (UID: "9a2a65a7-bbfb-40ee-bfe2-f99d1173daef"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.673343 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-kube-api-access-g52tw" (OuterVolumeSpecName: "kube-api-access-g52tw") pod "9a2a65a7-bbfb-40ee-bfe2-f99d1173daef" (UID: "9a2a65a7-bbfb-40ee-bfe2-f99d1173daef"). InnerVolumeSpecName "kube-api-access-g52tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.706715 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a2a65a7-bbfb-40ee-bfe2-f99d1173daef" (UID: "9a2a65a7-bbfb-40ee-bfe2-f99d1173daef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.732862 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-config-data" (OuterVolumeSpecName: "config-data") pod "9a2a65a7-bbfb-40ee-bfe2-f99d1173daef" (UID: "9a2a65a7-bbfb-40ee-bfe2-f99d1173daef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.768801 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g52tw\" (UniqueName: \"kubernetes.io/projected/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-kube-api-access-g52tw\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.768857 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.768866 4867 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:30 crc kubenswrapper[4867]: I1201 10:01:30.768875 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a65a7-bbfb-40ee-bfe2-f99d1173daef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:31 crc kubenswrapper[4867]: I1201 10:01:31.529552 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409721-k9xqv" Dec 01 10:01:31 crc kubenswrapper[4867]: E1201 10:01:31.531716 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="31b3d747-c383-483d-8919-be1dd3a266b6" Dec 01 10:01:40 crc kubenswrapper[4867]: I1201 10:01:40.830219 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:01:40 crc kubenswrapper[4867]: E1201 10:01:40.832432 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:01:45 crc kubenswrapper[4867]: I1201 10:01:45.356852 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 10:01:47 crc kubenswrapper[4867]: I1201 10:01:47.677133 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"31b3d747-c383-483d-8919-be1dd3a266b6","Type":"ContainerStarted","Data":"f948a94cd1b99d4df29f081e26ec442a368417e18fba05cb3ed8073b8018f431"} Dec 01 10:01:47 crc kubenswrapper[4867]: I1201 10:01:47.707032 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.118888507 podStartE2EDuration="1m0.70701474s" podCreationTimestamp="2025-12-01 10:00:47 +0000 UTC" firstStartedPulling="2025-12-01 10:00:49.766198705 +0000 UTC 
m=+3171.225585459" lastFinishedPulling="2025-12-01 10:01:45.354324938 +0000 UTC m=+3226.813711692" observedRunningTime="2025-12-01 10:01:47.699219786 +0000 UTC m=+3229.158606570" watchObservedRunningTime="2025-12-01 10:01:47.70701474 +0000 UTC m=+3229.166401484" Dec 01 10:01:52 crc kubenswrapper[4867]: I1201 10:01:52.827733 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:01:53 crc kubenswrapper[4867]: I1201 10:01:53.730458 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"92327f65a1be78e5e4a126caf146bbc31afd87bb225d5a46125eb70cdf8d833b"} Dec 01 10:04:15 crc kubenswrapper[4867]: I1201 10:04:15.956644 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4rkdq"] Dec 01 10:04:15 crc kubenswrapper[4867]: E1201 10:04:15.958035 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a2a65a7-bbfb-40ee-bfe2-f99d1173daef" containerName="keystone-cron" Dec 01 10:04:15 crc kubenswrapper[4867]: I1201 10:04:15.958066 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2a65a7-bbfb-40ee-bfe2-f99d1173daef" containerName="keystone-cron" Dec 01 10:04:15 crc kubenswrapper[4867]: I1201 10:04:15.958409 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a2a65a7-bbfb-40ee-bfe2-f99d1173daef" containerName="keystone-cron" Dec 01 10:04:15 crc kubenswrapper[4867]: I1201 10:04:15.961139 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:15 crc kubenswrapper[4867]: I1201 10:04:15.992500 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rkdq"] Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.083422 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-utilities\") pod \"certified-operators-4rkdq\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.083503 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-catalog-content\") pod \"certified-operators-4rkdq\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.083582 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7fwc\" (UniqueName: \"kubernetes.io/projected/593fd346-565a-4e98-a9ca-2a641ee1fe09-kube-api-access-v7fwc\") pod \"certified-operators-4rkdq\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.185747 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7fwc\" (UniqueName: \"kubernetes.io/projected/593fd346-565a-4e98-a9ca-2a641ee1fe09-kube-api-access-v7fwc\") pod \"certified-operators-4rkdq\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.186191 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-utilities\") pod \"certified-operators-4rkdq\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.186259 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-catalog-content\") pod \"certified-operators-4rkdq\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.187057 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-catalog-content\") pod \"certified-operators-4rkdq\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.187111 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-utilities\") pod \"certified-operators-4rkdq\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.224102 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7fwc\" (UniqueName: \"kubernetes.io/projected/593fd346-565a-4e98-a9ca-2a641ee1fe09-kube-api-access-v7fwc\") pod \"certified-operators-4rkdq\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:16 crc kubenswrapper[4867]: I1201 10:04:16.310084 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:17 crc kubenswrapper[4867]: I1201 10:04:17.823411 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rkdq"] Dec 01 10:04:18 crc kubenswrapper[4867]: I1201 10:04:18.081674 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkdq" event={"ID":"593fd346-565a-4e98-a9ca-2a641ee1fe09","Type":"ContainerStarted","Data":"1e8da87588e4cab8e08ac6d48f3a8a16a511fd23a09dd49061103fec45ce8e76"} Dec 01 10:04:19 crc kubenswrapper[4867]: I1201 10:04:19.095384 4867 generic.go:334] "Generic (PLEG): container finished" podID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerID="9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548" exitCode=0 Dec 01 10:04:19 crc kubenswrapper[4867]: I1201 10:04:19.095733 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkdq" event={"ID":"593fd346-565a-4e98-a9ca-2a641ee1fe09","Type":"ContainerDied","Data":"9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548"} Dec 01 10:04:21 crc kubenswrapper[4867]: I1201 10:04:21.115345 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkdq" event={"ID":"593fd346-565a-4e98-a9ca-2a641ee1fe09","Type":"ContainerStarted","Data":"e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1"} Dec 01 10:04:21 crc kubenswrapper[4867]: I1201 10:04:21.601800 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:04:21 crc kubenswrapper[4867]: I1201 10:04:21.601884 4867 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:04:22 crc kubenswrapper[4867]: I1201 10:04:22.157769 4867 generic.go:334] "Generic (PLEG): container finished" podID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerID="e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1" exitCode=0 Dec 01 10:04:22 crc kubenswrapper[4867]: I1201 10:04:22.157947 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkdq" event={"ID":"593fd346-565a-4e98-a9ca-2a641ee1fe09","Type":"ContainerDied","Data":"e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1"} Dec 01 10:04:23 crc kubenswrapper[4867]: I1201 10:04:23.182181 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkdq" event={"ID":"593fd346-565a-4e98-a9ca-2a641ee1fe09","Type":"ContainerStarted","Data":"c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751"} Dec 01 10:04:23 crc kubenswrapper[4867]: I1201 10:04:23.218982 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4rkdq" podStartSLOduration=4.452886036 podStartE2EDuration="8.218959218s" podCreationTimestamp="2025-12-01 10:04:15 +0000 UTC" firstStartedPulling="2025-12-01 10:04:19.100213672 +0000 UTC m=+3380.559600426" lastFinishedPulling="2025-12-01 10:04:22.866286854 +0000 UTC m=+3384.325673608" observedRunningTime="2025-12-01 10:04:23.202701113 +0000 UTC m=+3384.662087867" watchObservedRunningTime="2025-12-01 10:04:23.218959218 +0000 UTC m=+3384.678345972" Dec 01 10:04:26 crc kubenswrapper[4867]: I1201 10:04:26.310919 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:26 crc kubenswrapper[4867]: I1201 10:04:26.312321 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:26 crc kubenswrapper[4867]: I1201 10:04:26.458639 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:28 crc kubenswrapper[4867]: I1201 10:04:28.294040 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:28 crc kubenswrapper[4867]: I1201 10:04:28.352976 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rkdq"] Dec 01 10:04:30 crc kubenswrapper[4867]: I1201 10:04:30.253639 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4rkdq" podUID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerName="registry-server" containerID="cri-o://c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751" gracePeriod=2 Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.067933 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.071723 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-utilities\") pod \"593fd346-565a-4e98-a9ca-2a641ee1fe09\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.071779 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-catalog-content\") pod \"593fd346-565a-4e98-a9ca-2a641ee1fe09\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.071851 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7fwc\" (UniqueName: \"kubernetes.io/projected/593fd346-565a-4e98-a9ca-2a641ee1fe09-kube-api-access-v7fwc\") pod \"593fd346-565a-4e98-a9ca-2a641ee1fe09\" (UID: \"593fd346-565a-4e98-a9ca-2a641ee1fe09\") " Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.073985 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-utilities" (OuterVolumeSpecName: "utilities") pod "593fd346-565a-4e98-a9ca-2a641ee1fe09" (UID: "593fd346-565a-4e98-a9ca-2a641ee1fe09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.080032 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593fd346-565a-4e98-a9ca-2a641ee1fe09-kube-api-access-v7fwc" (OuterVolumeSpecName: "kube-api-access-v7fwc") pod "593fd346-565a-4e98-a9ca-2a641ee1fe09" (UID: "593fd346-565a-4e98-a9ca-2a641ee1fe09"). InnerVolumeSpecName "kube-api-access-v7fwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.165966 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "593fd346-565a-4e98-a9ca-2a641ee1fe09" (UID: "593fd346-565a-4e98-a9ca-2a641ee1fe09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.174721 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.175223 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593fd346-565a-4e98-a9ca-2a641ee1fe09-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.175238 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7fwc\" (UniqueName: \"kubernetes.io/projected/593fd346-565a-4e98-a9ca-2a641ee1fe09-kube-api-access-v7fwc\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.264747 4867 generic.go:334] "Generic (PLEG): container finished" podID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerID="c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751" exitCode=0 Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.264874 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rkdq" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.264895 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkdq" event={"ID":"593fd346-565a-4e98-a9ca-2a641ee1fe09","Type":"ContainerDied","Data":"c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751"} Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.266077 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rkdq" event={"ID":"593fd346-565a-4e98-a9ca-2a641ee1fe09","Type":"ContainerDied","Data":"1e8da87588e4cab8e08ac6d48f3a8a16a511fd23a09dd49061103fec45ce8e76"} Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.266104 4867 scope.go:117] "RemoveContainer" containerID="c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.313866 4867 scope.go:117] "RemoveContainer" containerID="e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.321020 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rkdq"] Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.330277 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4rkdq"] Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.341112 4867 scope.go:117] "RemoveContainer" containerID="9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.389676 4867 scope.go:117] "RemoveContainer" containerID="c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751" Dec 01 10:04:31 crc kubenswrapper[4867]: E1201 10:04:31.390447 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751\": container with ID starting with c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751 not found: ID does not exist" containerID="c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.390515 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751"} err="failed to get container status \"c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751\": rpc error: code = NotFound desc = could not find container \"c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751\": container with ID starting with c97b68720725c3a6615b085903684bea93243c3a35212919f85ba8e116085751 not found: ID does not exist" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.390548 4867 scope.go:117] "RemoveContainer" containerID="e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1" Dec 01 10:04:31 crc kubenswrapper[4867]: E1201 10:04:31.391185 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1\": container with ID starting with e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1 not found: ID does not exist" containerID="e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.391231 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1"} err="failed to get container status \"e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1\": rpc error: code = NotFound desc = could not find container \"e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1\": container with ID 
starting with e0ea569f44c0a7f84f51927e5c5550a9efe88a29adf606830ac181f8380c6fe1 not found: ID does not exist" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.391249 4867 scope.go:117] "RemoveContainer" containerID="9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548" Dec 01 10:04:31 crc kubenswrapper[4867]: E1201 10:04:31.391728 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548\": container with ID starting with 9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548 not found: ID does not exist" containerID="9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548" Dec 01 10:04:31 crc kubenswrapper[4867]: I1201 10:04:31.391769 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548"} err="failed to get container status \"9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548\": rpc error: code = NotFound desc = could not find container \"9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548\": container with ID starting with 9e9d880fb396ef0352e8c7578908e4ac2a2e6b2e970148477aa797ef5ad3c548 not found: ID does not exist" Dec 01 10:04:32 crc kubenswrapper[4867]: I1201 10:04:32.839269 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593fd346-565a-4e98-a9ca-2a641ee1fe09" path="/var/lib/kubelet/pods/593fd346-565a-4e98-a9ca-2a641ee1fe09/volumes" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.522535 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vn2pz"] Dec 01 10:04:34 crc kubenswrapper[4867]: E1201 10:04:34.523762 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerName="registry-server" Dec 01 10:04:34 crc 
kubenswrapper[4867]: I1201 10:04:34.523855 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerName="registry-server" Dec 01 10:04:34 crc kubenswrapper[4867]: E1201 10:04:34.523927 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerName="extract-content" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.524010 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerName="extract-content" Dec 01 10:04:34 crc kubenswrapper[4867]: E1201 10:04:34.524091 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerName="extract-utilities" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.524150 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerName="extract-utilities" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.524396 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="593fd346-565a-4e98-a9ca-2a641ee1fe09" containerName="registry-server" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.525795 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.539333 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn2pz"] Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.641734 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-utilities\") pod \"redhat-marketplace-vn2pz\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.641888 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpxsp\" (UniqueName: \"kubernetes.io/projected/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-kube-api-access-hpxsp\") pod \"redhat-marketplace-vn2pz\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.641926 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-catalog-content\") pod \"redhat-marketplace-vn2pz\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.743258 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpxsp\" (UniqueName: \"kubernetes.io/projected/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-kube-api-access-hpxsp\") pod \"redhat-marketplace-vn2pz\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.743346 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-catalog-content\") pod \"redhat-marketplace-vn2pz\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.743414 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-utilities\") pod \"redhat-marketplace-vn2pz\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.743878 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-utilities\") pod \"redhat-marketplace-vn2pz\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.743886 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-catalog-content\") pod \"redhat-marketplace-vn2pz\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.814329 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpxsp\" (UniqueName: \"kubernetes.io/projected/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-kube-api-access-hpxsp\") pod \"redhat-marketplace-vn2pz\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:34 crc kubenswrapper[4867]: I1201 10:04:34.847740 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:35 crc kubenswrapper[4867]: I1201 10:04:35.389443 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn2pz"] Dec 01 10:04:35 crc kubenswrapper[4867]: W1201 10:04:35.416108 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f166646_ff8c_4ea9_aa73_b07dcafc0eb5.slice/crio-67bea71f391198b47b21bd7799686b83f60606d20e8027801f9021dbf624683b WatchSource:0}: Error finding container 67bea71f391198b47b21bd7799686b83f60606d20e8027801f9021dbf624683b: Status 404 returned error can't find the container with id 67bea71f391198b47b21bd7799686b83f60606d20e8027801f9021dbf624683b Dec 01 10:04:36 crc kubenswrapper[4867]: I1201 10:04:36.312311 4867 generic.go:334] "Generic (PLEG): container finished" podID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerID="7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86" exitCode=0 Dec 01 10:04:36 crc kubenswrapper[4867]: I1201 10:04:36.312418 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn2pz" event={"ID":"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5","Type":"ContainerDied","Data":"7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86"} Dec 01 10:04:36 crc kubenswrapper[4867]: I1201 10:04:36.312600 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn2pz" event={"ID":"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5","Type":"ContainerStarted","Data":"67bea71f391198b47b21bd7799686b83f60606d20e8027801f9021dbf624683b"} Dec 01 10:04:37 crc kubenswrapper[4867]: I1201 10:04:37.323786 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn2pz" 
event={"ID":"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5","Type":"ContainerStarted","Data":"d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd"} Dec 01 10:04:38 crc kubenswrapper[4867]: I1201 10:04:38.335055 4867 generic.go:334] "Generic (PLEG): container finished" podID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerID="d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd" exitCode=0 Dec 01 10:04:38 crc kubenswrapper[4867]: I1201 10:04:38.335126 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn2pz" event={"ID":"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5","Type":"ContainerDied","Data":"d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd"} Dec 01 10:04:40 crc kubenswrapper[4867]: I1201 10:04:40.363064 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn2pz" event={"ID":"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5","Type":"ContainerStarted","Data":"033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6"} Dec 01 10:04:40 crc kubenswrapper[4867]: I1201 10:04:40.387490 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vn2pz" podStartSLOduration=3.168392366 podStartE2EDuration="6.387466648s" podCreationTimestamp="2025-12-01 10:04:34 +0000 UTC" firstStartedPulling="2025-12-01 10:04:36.315315968 +0000 UTC m=+3397.774702742" lastFinishedPulling="2025-12-01 10:04:39.53439027 +0000 UTC m=+3400.993777024" observedRunningTime="2025-12-01 10:04:40.386257014 +0000 UTC m=+3401.845643768" watchObservedRunningTime="2025-12-01 10:04:40.387466648 +0000 UTC m=+3401.846853402" Dec 01 10:04:44 crc kubenswrapper[4867]: I1201 10:04:44.848927 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:44 crc kubenswrapper[4867]: I1201 10:04:44.850375 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:44 crc kubenswrapper[4867]: I1201 10:04:44.902338 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:45 crc kubenswrapper[4867]: I1201 10:04:45.488550 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:45 crc kubenswrapper[4867]: I1201 10:04:45.545731 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn2pz"] Dec 01 10:04:47 crc kubenswrapper[4867]: I1201 10:04:47.445407 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vn2pz" podUID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerName="registry-server" containerID="cri-o://033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6" gracePeriod=2 Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.205375 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.212643 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpxsp\" (UniqueName: \"kubernetes.io/projected/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-kube-api-access-hpxsp\") pod \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.212857 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-utilities\") pod \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.213006 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-catalog-content\") pod \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\" (UID: \"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5\") " Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.214706 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-utilities" (OuterVolumeSpecName: "utilities") pod "6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" (UID: "6f166646-ff8c-4ea9-aa73-b07dcafc0eb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.239048 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-kube-api-access-hpxsp" (OuterVolumeSpecName: "kube-api-access-hpxsp") pod "6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" (UID: "6f166646-ff8c-4ea9-aa73-b07dcafc0eb5"). InnerVolumeSpecName "kube-api-access-hpxsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.244404 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" (UID: "6f166646-ff8c-4ea9-aa73-b07dcafc0eb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.317239 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpxsp\" (UniqueName: \"kubernetes.io/projected/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-kube-api-access-hpxsp\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.317280 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.317290 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.459410 4867 generic.go:334] "Generic (PLEG): container finished" podID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerID="033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6" exitCode=0 Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.459469 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn2pz" event={"ID":"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5","Type":"ContainerDied","Data":"033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6"} Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.459504 4867 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-vn2pz" event={"ID":"6f166646-ff8c-4ea9-aa73-b07dcafc0eb5","Type":"ContainerDied","Data":"67bea71f391198b47b21bd7799686b83f60606d20e8027801f9021dbf624683b"} Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.459524 4867 scope.go:117] "RemoveContainer" containerID="033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.459703 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vn2pz" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.515324 4867 scope.go:117] "RemoveContainer" containerID="d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.516742 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn2pz"] Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.532319 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn2pz"] Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.539510 4867 scope.go:117] "RemoveContainer" containerID="7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.590565 4867 scope.go:117] "RemoveContainer" containerID="033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6" Dec 01 10:04:48 crc kubenswrapper[4867]: E1201 10:04:48.591303 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6\": container with ID starting with 033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6 not found: ID does not exist" containerID="033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.591352 4867 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6"} err="failed to get container status \"033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6\": rpc error: code = NotFound desc = could not find container \"033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6\": container with ID starting with 033ce7298adb19822f47c2428eb87eb76ea3edf6094085d7a7a336be9a05bab6 not found: ID does not exist" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.591408 4867 scope.go:117] "RemoveContainer" containerID="d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd" Dec 01 10:04:48 crc kubenswrapper[4867]: E1201 10:04:48.592005 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd\": container with ID starting with d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd not found: ID does not exist" containerID="d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.592039 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd"} err="failed to get container status \"d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd\": rpc error: code = NotFound desc = could not find container \"d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd\": container with ID starting with d2fb9907ac2ce426189341c3ab048ffa7e7a74eb1fe33011f66be8f4a84ffedd not found: ID does not exist" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.592055 4867 scope.go:117] "RemoveContainer" containerID="7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86" Dec 01 10:04:48 crc kubenswrapper[4867]: E1201 
10:04:48.592341 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86\": container with ID starting with 7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86 not found: ID does not exist" containerID="7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.592373 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86"} err="failed to get container status \"7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86\": rpc error: code = NotFound desc = could not find container \"7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86\": container with ID starting with 7ade32c92479d850870e0081dbf146ff210c603aaef52940314b6c9cc93b2a86 not found: ID does not exist" Dec 01 10:04:48 crc kubenswrapper[4867]: I1201 10:04:48.839207 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" path="/var/lib/kubelet/pods/6f166646-ff8c-4ea9-aa73-b07dcafc0eb5/volumes" Dec 01 10:04:51 crc kubenswrapper[4867]: I1201 10:04:51.601044 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:04:51 crc kubenswrapper[4867]: I1201 10:04:51.602524 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 01 10:05:21 crc kubenswrapper[4867]: I1201 10:05:21.601436 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:05:21 crc kubenswrapper[4867]: I1201 10:05:21.603139 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:05:21 crc kubenswrapper[4867]: I1201 10:05:21.603268 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 10:05:21 crc kubenswrapper[4867]: I1201 10:05:21.604141 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92327f65a1be78e5e4a126caf146bbc31afd87bb225d5a46125eb70cdf8d833b"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:05:21 crc kubenswrapper[4867]: I1201 10:05:21.604326 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://92327f65a1be78e5e4a126caf146bbc31afd87bb225d5a46125eb70cdf8d833b" gracePeriod=600 Dec 01 10:05:21 crc kubenswrapper[4867]: I1201 10:05:21.763412 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" 
containerID="92327f65a1be78e5e4a126caf146bbc31afd87bb225d5a46125eb70cdf8d833b" exitCode=0 Dec 01 10:05:21 crc kubenswrapper[4867]: I1201 10:05:21.763728 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"92327f65a1be78e5e4a126caf146bbc31afd87bb225d5a46125eb70cdf8d833b"} Dec 01 10:05:21 crc kubenswrapper[4867]: I1201 10:05:21.763882 4867 scope.go:117] "RemoveContainer" containerID="3ffbbf5169694054c04597cbf390ffe12bc73250fe5f69041cd230c91df2d1d8" Dec 01 10:05:22 crc kubenswrapper[4867]: I1201 10:05:22.776503 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b"} Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.098675 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-54smd"] Dec 01 10:06:45 crc kubenswrapper[4867]: E1201 10:06:45.099592 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerName="registry-server" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.099606 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerName="registry-server" Dec 01 10:06:45 crc kubenswrapper[4867]: E1201 10:06:45.099628 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerName="extract-utilities" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.099634 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerName="extract-utilities" Dec 01 10:06:45 crc kubenswrapper[4867]: E1201 10:06:45.099678 4867 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerName="extract-content" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.099684 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerName="extract-content" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.099966 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f166646-ff8c-4ea9-aa73-b07dcafc0eb5" containerName="registry-server" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.101567 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.119935 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-54smd"] Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.167695 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-utilities\") pod \"redhat-operators-54smd\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.167790 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctn6d\" (UniqueName: \"kubernetes.io/projected/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-kube-api-access-ctn6d\") pod \"redhat-operators-54smd\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.168046 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-catalog-content\") pod 
\"redhat-operators-54smd\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.269555 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctn6d\" (UniqueName: \"kubernetes.io/projected/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-kube-api-access-ctn6d\") pod \"redhat-operators-54smd\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.269680 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-catalog-content\") pod \"redhat-operators-54smd\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.269943 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-utilities\") pod \"redhat-operators-54smd\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.270218 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-catalog-content\") pod \"redhat-operators-54smd\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.270350 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-utilities\") pod \"redhat-operators-54smd\" (UID: 
\"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.288645 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctn6d\" (UniqueName: \"kubernetes.io/projected/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-kube-api-access-ctn6d\") pod \"redhat-operators-54smd\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.431254 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:06:45 crc kubenswrapper[4867]: I1201 10:06:45.997503 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-54smd"] Dec 01 10:06:45 crc kubenswrapper[4867]: W1201 10:06:45.998615 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c3f527_e2e1_4a92_b2dc_76cb294f84a6.slice/crio-552aafbc1e11d24c7f7a64296c2adb3c6e6477bb27010ea96992c890bafd94ab WatchSource:0}: Error finding container 552aafbc1e11d24c7f7a64296c2adb3c6e6477bb27010ea96992c890bafd94ab: Status 404 returned error can't find the container with id 552aafbc1e11d24c7f7a64296c2adb3c6e6477bb27010ea96992c890bafd94ab Dec 01 10:06:46 crc kubenswrapper[4867]: I1201 10:06:46.524658 4867 generic.go:334] "Generic (PLEG): container finished" podID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerID="2384f17427db5fb292830cfd1e00e30d3dd9916a8453c25be40d4172bec84758" exitCode=0 Dec 01 10:06:46 crc kubenswrapper[4867]: I1201 10:06:46.524736 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54smd" event={"ID":"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6","Type":"ContainerDied","Data":"2384f17427db5fb292830cfd1e00e30d3dd9916a8453c25be40d4172bec84758"} Dec 01 10:06:46 crc 
kubenswrapper[4867]: I1201 10:06:46.524991 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54smd" event={"ID":"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6","Type":"ContainerStarted","Data":"552aafbc1e11d24c7f7a64296c2adb3c6e6477bb27010ea96992c890bafd94ab"} Dec 01 10:06:46 crc kubenswrapper[4867]: I1201 10:06:46.527204 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:06:59 crc kubenswrapper[4867]: I1201 10:06:59.683683 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54smd" event={"ID":"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6","Type":"ContainerStarted","Data":"98956a490afdca9304dc98ee31dad6bcf1ece77a58ab61eb7b36bc0be316f37b"} Dec 01 10:07:02 crc kubenswrapper[4867]: I1201 10:07:02.710767 4867 generic.go:334] "Generic (PLEG): container finished" podID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerID="98956a490afdca9304dc98ee31dad6bcf1ece77a58ab61eb7b36bc0be316f37b" exitCode=0 Dec 01 10:07:02 crc kubenswrapper[4867]: I1201 10:07:02.710853 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54smd" event={"ID":"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6","Type":"ContainerDied","Data":"98956a490afdca9304dc98ee31dad6bcf1ece77a58ab61eb7b36bc0be316f37b"} Dec 01 10:07:03 crc kubenswrapper[4867]: I1201 10:07:03.721482 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54smd" event={"ID":"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6","Type":"ContainerStarted","Data":"62b683845159d532a0d256c67c241bd4126bfc6e3ea5aad26d618334dde3f684"} Dec 01 10:07:03 crc kubenswrapper[4867]: I1201 10:07:03.746698 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-54smd" podStartSLOduration=2.123098699 podStartE2EDuration="18.746679083s" podCreationTimestamp="2025-12-01 10:06:45 +0000 UTC" 
firstStartedPulling="2025-12-01 10:06:46.526358497 +0000 UTC m=+3527.985745251" lastFinishedPulling="2025-12-01 10:07:03.149938881 +0000 UTC m=+3544.609325635" observedRunningTime="2025-12-01 10:07:03.738956961 +0000 UTC m=+3545.198343715" watchObservedRunningTime="2025-12-01 10:07:03.746679083 +0000 UTC m=+3545.206065837" Dec 01 10:07:05 crc kubenswrapper[4867]: I1201 10:07:05.431685 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:07:05 crc kubenswrapper[4867]: I1201 10:07:05.432057 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:07:06 crc kubenswrapper[4867]: I1201 10:07:06.475561 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-54smd" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerName="registry-server" probeResult="failure" output=< Dec 01 10:07:06 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 10:07:06 crc kubenswrapper[4867]: > Dec 01 10:07:15 crc kubenswrapper[4867]: I1201 10:07:15.489800 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:07:15 crc kubenswrapper[4867]: I1201 10:07:15.551100 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:07:16 crc kubenswrapper[4867]: I1201 10:07:16.148104 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-54smd"] Dec 01 10:07:16 crc kubenswrapper[4867]: I1201 10:07:16.300787 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tzz5n"] Dec 01 10:07:16 crc kubenswrapper[4867]: I1201 10:07:16.301283 4867 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-tzz5n" podUID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerName="registry-server" containerID="cri-o://a8cbfc39946c027dde8cc2267dfe39fd842fd3db140992ed421df680fd7d77d5" gracePeriod=2 Dec 01 10:07:16 crc kubenswrapper[4867]: I1201 10:07:16.898189 4867 generic.go:334] "Generic (PLEG): container finished" podID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerID="a8cbfc39946c027dde8cc2267dfe39fd842fd3db140992ed421df680fd7d77d5" exitCode=0 Dec 01 10:07:16 crc kubenswrapper[4867]: I1201 10:07:16.899062 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzz5n" event={"ID":"9dd83b39-8ab6-4e60-9ff6-53129612dff2","Type":"ContainerDied","Data":"a8cbfc39946c027dde8cc2267dfe39fd842fd3db140992ed421df680fd7d77d5"} Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.184659 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.280335 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t72p\" (UniqueName: \"kubernetes.io/projected/9dd83b39-8ab6-4e60-9ff6-53129612dff2-kube-api-access-2t72p\") pod \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.280513 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-utilities\") pod \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.280740 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-catalog-content\") pod 
\"9dd83b39-8ab6-4e60-9ff6-53129612dff2\" (UID: \"9dd83b39-8ab6-4e60-9ff6-53129612dff2\") " Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.282598 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-utilities" (OuterVolumeSpecName: "utilities") pod "9dd83b39-8ab6-4e60-9ff6-53129612dff2" (UID: "9dd83b39-8ab6-4e60-9ff6-53129612dff2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.311489 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd83b39-8ab6-4e60-9ff6-53129612dff2-kube-api-access-2t72p" (OuterVolumeSpecName: "kube-api-access-2t72p") pod "9dd83b39-8ab6-4e60-9ff6-53129612dff2" (UID: "9dd83b39-8ab6-4e60-9ff6-53129612dff2"). InnerVolumeSpecName "kube-api-access-2t72p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.382910 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.383280 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t72p\" (UniqueName: \"kubernetes.io/projected/9dd83b39-8ab6-4e60-9ff6-53129612dff2-kube-api-access-2t72p\") on node \"crc\" DevicePath \"\"" Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.476321 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dd83b39-8ab6-4e60-9ff6-53129612dff2" (UID: "9dd83b39-8ab6-4e60-9ff6-53129612dff2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.485209 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd83b39-8ab6-4e60-9ff6-53129612dff2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.911268 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzz5n" event={"ID":"9dd83b39-8ab6-4e60-9ff6-53129612dff2","Type":"ContainerDied","Data":"fb75eff09257f0c167b1d2d9d10e2b485074459970b3e0b8235e004e257dcc96"} Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.911413 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tzz5n" Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.912078 4867 scope.go:117] "RemoveContainer" containerID="a8cbfc39946c027dde8cc2267dfe39fd842fd3db140992ed421df680fd7d77d5" Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.960864 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tzz5n"] Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.970690 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tzz5n"] Dec 01 10:07:17 crc kubenswrapper[4867]: I1201 10:07:17.971750 4867 scope.go:117] "RemoveContainer" containerID="e65e3c90b12e72a1f4774cde19986d8546588bf943a14e8ec755d3f3c11688b8" Dec 01 10:07:18 crc kubenswrapper[4867]: I1201 10:07:18.024847 4867 scope.go:117] "RemoveContainer" containerID="1170a7643f0cdc92c2664fe1ff7e6069be403af36c758ad0b1c168fab7584a70" Dec 01 10:07:18 crc kubenswrapper[4867]: I1201 10:07:18.845424 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" path="/var/lib/kubelet/pods/9dd83b39-8ab6-4e60-9ff6-53129612dff2/volumes" Dec 01 10:07:21 crc 
kubenswrapper[4867]: I1201 10:07:21.601888 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:07:21 crc kubenswrapper[4867]: I1201 10:07:21.602508 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:07:51 crc kubenswrapper[4867]: I1201 10:07:51.601252 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:07:51 crc kubenswrapper[4867]: I1201 10:07:51.601775 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:08:21 crc kubenswrapper[4867]: I1201 10:08:21.601830 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:08:21 crc kubenswrapper[4867]: I1201 10:08:21.602449 4867 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:08:21 crc kubenswrapper[4867]: I1201 10:08:21.602509 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 10:08:21 crc kubenswrapper[4867]: I1201 10:08:21.603403 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:08:21 crc kubenswrapper[4867]: I1201 10:08:21.603475 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" gracePeriod=600 Dec 01 10:08:22 crc kubenswrapper[4867]: E1201 10:08:22.229218 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:08:22 crc kubenswrapper[4867]: I1201 10:08:22.511122 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" 
containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" exitCode=0 Dec 01 10:08:22 crc kubenswrapper[4867]: I1201 10:08:22.511177 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b"} Dec 01 10:08:22 crc kubenswrapper[4867]: I1201 10:08:22.511215 4867 scope.go:117] "RemoveContainer" containerID="92327f65a1be78e5e4a126caf146bbc31afd87bb225d5a46125eb70cdf8d833b" Dec 01 10:08:22 crc kubenswrapper[4867]: I1201 10:08:22.512000 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:08:22 crc kubenswrapper[4867]: E1201 10:08:22.512489 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:08:37 crc kubenswrapper[4867]: I1201 10:08:37.826940 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:08:37 crc kubenswrapper[4867]: E1201 10:08:37.827862 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:08:50 crc kubenswrapper[4867]: I1201 
10:08:50.827309 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:08:50 crc kubenswrapper[4867]: E1201 10:08:50.827979 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:09:01 crc kubenswrapper[4867]: I1201 10:09:01.827117 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:09:01 crc kubenswrapper[4867]: E1201 10:09:01.827762 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:09:16 crc kubenswrapper[4867]: I1201 10:09:16.826470 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:09:16 crc kubenswrapper[4867]: E1201 10:09:16.827170 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:09:28 crc 
kubenswrapper[4867]: I1201 10:09:28.836205 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:09:28 crc kubenswrapper[4867]: E1201 10:09:28.837139 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:09:43 crc kubenswrapper[4867]: I1201 10:09:43.827224 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:09:43 crc kubenswrapper[4867]: E1201 10:09:43.827993 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:09:54 crc kubenswrapper[4867]: I1201 10:09:54.828638 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:09:54 crc kubenswrapper[4867]: E1201 10:09:54.829601 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 
01 10:10:08 crc kubenswrapper[4867]: I1201 10:10:08.832771 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:10:08 crc kubenswrapper[4867]: E1201 10:10:08.833731 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:10:20 crc kubenswrapper[4867]: I1201 10:10:20.827473 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:10:20 crc kubenswrapper[4867]: E1201 10:10:20.828737 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:10:32 crc kubenswrapper[4867]: I1201 10:10:32.827583 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:10:32 crc kubenswrapper[4867]: E1201 10:10:32.828569 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" 
podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:10:46 crc kubenswrapper[4867]: I1201 10:10:46.827610 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:10:46 crc kubenswrapper[4867]: E1201 10:10:46.828580 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:11:01 crc kubenswrapper[4867]: I1201 10:11:01.826263 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:11:01 crc kubenswrapper[4867]: E1201 10:11:01.826843 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:11:14 crc kubenswrapper[4867]: I1201 10:11:14.827481 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:11:14 crc kubenswrapper[4867]: E1201 10:11:14.828312 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:11:28 crc kubenswrapper[4867]: I1201 10:11:28.838301 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:11:28 crc kubenswrapper[4867]: E1201 10:11:28.838933 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:11:42 crc kubenswrapper[4867]: I1201 10:11:42.826942 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:11:42 crc kubenswrapper[4867]: E1201 10:11:42.827802 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:11:54 crc kubenswrapper[4867]: I1201 10:11:54.827564 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:11:54 crc kubenswrapper[4867]: E1201 10:11:54.828309 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:12:07 crc kubenswrapper[4867]: I1201 10:12:07.827666 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:12:07 crc kubenswrapper[4867]: E1201 10:12:07.828513 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:12:19 crc kubenswrapper[4867]: I1201 10:12:19.833216 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:12:19 crc kubenswrapper[4867]: E1201 10:12:19.835927 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:12:34 crc kubenswrapper[4867]: I1201 10:12:34.827129 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:12:34 crc kubenswrapper[4867]: E1201 10:12:34.828618 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:12:49 crc kubenswrapper[4867]: I1201 10:12:49.828149 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:12:49 crc kubenswrapper[4867]: E1201 10:12:49.830566 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:13:00 crc kubenswrapper[4867]: I1201 10:13:00.829793 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:13:00 crc kubenswrapper[4867]: E1201 10:13:00.830637 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:13:12 crc kubenswrapper[4867]: I1201 10:13:12.832608 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:13:12 crc kubenswrapper[4867]: E1201 10:13:12.833452 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:13:26 crc kubenswrapper[4867]: I1201 10:13:26.828160 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:13:28 crc kubenswrapper[4867]: I1201 10:13:28.055176 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"7038d3231484ebe59527ae4898f9d6a56414acfb58f056d0591d36edea976f2b"} Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.277988 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8x9hh"] Dec 01 10:13:46 crc kubenswrapper[4867]: E1201 10:13:46.279581 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerName="registry-server" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.279649 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerName="registry-server" Dec 01 10:13:46 crc kubenswrapper[4867]: E1201 10:13:46.279717 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerName="extract-content" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.279763 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerName="extract-content" Dec 01 10:13:46 crc kubenswrapper[4867]: E1201 10:13:46.279833 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerName="extract-utilities" Dec 01 10:13:46 crc kubenswrapper[4867]: 
I1201 10:13:46.279890 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerName="extract-utilities" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.280162 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd83b39-8ab6-4e60-9ff6-53129612dff2" containerName="registry-server" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.281599 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.299848 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8x9hh"] Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.304388 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c6c316b-bbb7-4e56-bced-aed519dec778-utilities\") pod \"community-operators-8x9hh\" (UID: \"6c6c316b-bbb7-4e56-bced-aed519dec778\") " pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.304488 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p65n2\" (UniqueName: \"kubernetes.io/projected/6c6c316b-bbb7-4e56-bced-aed519dec778-kube-api-access-p65n2\") pod \"community-operators-8x9hh\" (UID: \"6c6c316b-bbb7-4e56-bced-aed519dec778\") " pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.304601 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c6c316b-bbb7-4e56-bced-aed519dec778-catalog-content\") pod \"community-operators-8x9hh\" (UID: \"6c6c316b-bbb7-4e56-bced-aed519dec778\") " pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc 
kubenswrapper[4867]: I1201 10:13:46.406613 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c6c316b-bbb7-4e56-bced-aed519dec778-catalog-content\") pod \"community-operators-8x9hh\" (UID: \"6c6c316b-bbb7-4e56-bced-aed519dec778\") " pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.406699 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c6c316b-bbb7-4e56-bced-aed519dec778-utilities\") pod \"community-operators-8x9hh\" (UID: \"6c6c316b-bbb7-4e56-bced-aed519dec778\") " pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.406771 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p65n2\" (UniqueName: \"kubernetes.io/projected/6c6c316b-bbb7-4e56-bced-aed519dec778-kube-api-access-p65n2\") pod \"community-operators-8x9hh\" (UID: \"6c6c316b-bbb7-4e56-bced-aed519dec778\") " pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.407274 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c6c316b-bbb7-4e56-bced-aed519dec778-catalog-content\") pod \"community-operators-8x9hh\" (UID: \"6c6c316b-bbb7-4e56-bced-aed519dec778\") " pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.407870 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c6c316b-bbb7-4e56-bced-aed519dec778-utilities\") pod \"community-operators-8x9hh\" (UID: \"6c6c316b-bbb7-4e56-bced-aed519dec778\") " pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.429501 
4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p65n2\" (UniqueName: \"kubernetes.io/projected/6c6c316b-bbb7-4e56-bced-aed519dec778-kube-api-access-p65n2\") pod \"community-operators-8x9hh\" (UID: \"6c6c316b-bbb7-4e56-bced-aed519dec778\") " pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:46 crc kubenswrapper[4867]: I1201 10:13:46.605527 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:47 crc kubenswrapper[4867]: I1201 10:13:47.148045 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8x9hh"] Dec 01 10:13:47 crc kubenswrapper[4867]: W1201 10:13:47.152776 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c6c316b_bbb7_4e56_bced_aed519dec778.slice/crio-8c89e8ee4268835d0b90db2e5f7e63c9999a1dabb46ede7b58ea9facc0b599a3 WatchSource:0}: Error finding container 8c89e8ee4268835d0b90db2e5f7e63c9999a1dabb46ede7b58ea9facc0b599a3: Status 404 returned error can't find the container with id 8c89e8ee4268835d0b90db2e5f7e63c9999a1dabb46ede7b58ea9facc0b599a3 Dec 01 10:13:47 crc kubenswrapper[4867]: I1201 10:13:47.249670 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8x9hh" event={"ID":"6c6c316b-bbb7-4e56-bced-aed519dec778","Type":"ContainerStarted","Data":"8c89e8ee4268835d0b90db2e5f7e63c9999a1dabb46ede7b58ea9facc0b599a3"} Dec 01 10:13:48 crc kubenswrapper[4867]: I1201 10:13:48.276615 4867 generic.go:334] "Generic (PLEG): container finished" podID="6c6c316b-bbb7-4e56-bced-aed519dec778" containerID="461fe2ad8bb87644a978cf665a88c0802f469e8b9030cae1c6d3c1e7b1ae65ec" exitCode=0 Dec 01 10:13:48 crc kubenswrapper[4867]: I1201 10:13:48.276703 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8x9hh" 
event={"ID":"6c6c316b-bbb7-4e56-bced-aed519dec778","Type":"ContainerDied","Data":"461fe2ad8bb87644a978cf665a88c0802f469e8b9030cae1c6d3c1e7b1ae65ec"} Dec 01 10:13:48 crc kubenswrapper[4867]: I1201 10:13:48.282511 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:13:55 crc kubenswrapper[4867]: I1201 10:13:55.339011 4867 generic.go:334] "Generic (PLEG): container finished" podID="6c6c316b-bbb7-4e56-bced-aed519dec778" containerID="30f5150abf1848590ca82591ff68c1706e53ba7add503d560c0940861582b1ab" exitCode=0 Dec 01 10:13:55 crc kubenswrapper[4867]: I1201 10:13:55.339086 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8x9hh" event={"ID":"6c6c316b-bbb7-4e56-bced-aed519dec778","Type":"ContainerDied","Data":"30f5150abf1848590ca82591ff68c1706e53ba7add503d560c0940861582b1ab"} Dec 01 10:13:56 crc kubenswrapper[4867]: I1201 10:13:56.351318 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8x9hh" event={"ID":"6c6c316b-bbb7-4e56-bced-aed519dec778","Type":"ContainerStarted","Data":"65446723e4816a21c71168a29fab18f6db95639c1deb5087fc7426bfc5a16cfc"} Dec 01 10:13:56 crc kubenswrapper[4867]: I1201 10:13:56.606117 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:56 crc kubenswrapper[4867]: I1201 10:13:56.606155 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:13:57 crc kubenswrapper[4867]: I1201 10:13:57.653159 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8x9hh" podUID="6c6c316b-bbb7-4e56-bced-aed519dec778" containerName="registry-server" probeResult="failure" output=< Dec 01 10:13:57 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 10:13:57 crc 
kubenswrapper[4867]: > Dec 01 10:14:06 crc kubenswrapper[4867]: I1201 10:14:06.653236 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:14:06 crc kubenswrapper[4867]: I1201 10:14:06.676763 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8x9hh" podStartSLOduration=12.973009317 podStartE2EDuration="20.676742218s" podCreationTimestamp="2025-12-01 10:13:46 +0000 UTC" firstStartedPulling="2025-12-01 10:13:48.282244797 +0000 UTC m=+3949.741631551" lastFinishedPulling="2025-12-01 10:13:55.985977698 +0000 UTC m=+3957.445364452" observedRunningTime="2025-12-01 10:13:56.385003182 +0000 UTC m=+3957.844389936" watchObservedRunningTime="2025-12-01 10:14:06.676742218 +0000 UTC m=+3968.136128972" Dec 01 10:14:06 crc kubenswrapper[4867]: I1201 10:14:06.701417 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8x9hh" Dec 01 10:14:06 crc kubenswrapper[4867]: I1201 10:14:06.792784 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8x9hh"] Dec 01 10:14:06 crc kubenswrapper[4867]: I1201 10:14:06.907664 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-22kcv"] Dec 01 10:14:06 crc kubenswrapper[4867]: I1201 10:14:06.907903 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-22kcv" podUID="b12688cd-6bde-4fda-9453-18e0238eb201" containerName="registry-server" containerID="cri-o://11e142b10d578f605c76d943d8d44b066db7f09a2e3b2aa8356a57357db80506" gracePeriod=2 Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.477230 4867 generic.go:334] "Generic (PLEG): container finished" podID="b12688cd-6bde-4fda-9453-18e0238eb201" 
containerID="11e142b10d578f605c76d943d8d44b066db7f09a2e3b2aa8356a57357db80506" exitCode=0 Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.477608 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22kcv" event={"ID":"b12688cd-6bde-4fda-9453-18e0238eb201","Type":"ContainerDied","Data":"11e142b10d578f605c76d943d8d44b066db7f09a2e3b2aa8356a57357db80506"} Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.694865 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22kcv" Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.721184 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-catalog-content\") pod \"b12688cd-6bde-4fda-9453-18e0238eb201\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.721316 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7pvv\" (UniqueName: \"kubernetes.io/projected/b12688cd-6bde-4fda-9453-18e0238eb201-kube-api-access-c7pvv\") pod \"b12688cd-6bde-4fda-9453-18e0238eb201\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.721351 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-utilities\") pod \"b12688cd-6bde-4fda-9453-18e0238eb201\" (UID: \"b12688cd-6bde-4fda-9453-18e0238eb201\") " Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.722111 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-utilities" (OuterVolumeSpecName: "utilities") pod "b12688cd-6bde-4fda-9453-18e0238eb201" (UID: 
"b12688cd-6bde-4fda-9453-18e0238eb201"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.744593 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12688cd-6bde-4fda-9453-18e0238eb201-kube-api-access-c7pvv" (OuterVolumeSpecName: "kube-api-access-c7pvv") pod "b12688cd-6bde-4fda-9453-18e0238eb201" (UID: "b12688cd-6bde-4fda-9453-18e0238eb201"). InnerVolumeSpecName "kube-api-access-c7pvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.820696 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b12688cd-6bde-4fda-9453-18e0238eb201" (UID: "b12688cd-6bde-4fda-9453-18e0238eb201"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.823921 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.824027 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7pvv\" (UniqueName: \"kubernetes.io/projected/b12688cd-6bde-4fda-9453-18e0238eb201-kube-api-access-c7pvv\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:07 crc kubenswrapper[4867]: I1201 10:14:07.824094 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b12688cd-6bde-4fda-9453-18e0238eb201-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:08 crc kubenswrapper[4867]: I1201 10:14:08.489037 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-22kcv" event={"ID":"b12688cd-6bde-4fda-9453-18e0238eb201","Type":"ContainerDied","Data":"2ead05c078b5017297f5bb1335e059d257ea668fd4f1cdfdef4f62d5750abb7a"} Dec 01 10:14:08 crc kubenswrapper[4867]: I1201 10:14:08.489418 4867 scope.go:117] "RemoveContainer" containerID="11e142b10d578f605c76d943d8d44b066db7f09a2e3b2aa8356a57357db80506" Dec 01 10:14:08 crc kubenswrapper[4867]: I1201 10:14:08.489095 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22kcv" Dec 01 10:14:08 crc kubenswrapper[4867]: I1201 10:14:08.525719 4867 scope.go:117] "RemoveContainer" containerID="7a02d063492adc4f9abecaf9587509ccc9f296ea25d38d8c608666d5f96f8de3" Dec 01 10:14:08 crc kubenswrapper[4867]: I1201 10:14:08.553876 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-22kcv"] Dec 01 10:14:08 crc kubenswrapper[4867]: I1201 10:14:08.584000 4867 scope.go:117] "RemoveContainer" containerID="da0e86917115345a2f7d9b10b9698b90e812ce5accbfdba745e394f012598112" Dec 01 10:14:08 crc kubenswrapper[4867]: I1201 10:14:08.584759 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-22kcv"] Dec 01 10:14:08 crc kubenswrapper[4867]: I1201 10:14:08.836954 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12688cd-6bde-4fda-9453-18e0238eb201" path="/var/lib/kubelet/pods/b12688cd-6bde-4fda-9453-18e0238eb201/volumes" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.035693 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lz2t2"] Dec 01 10:14:44 crc kubenswrapper[4867]: E1201 10:14:44.037447 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12688cd-6bde-4fda-9453-18e0238eb201" containerName="registry-server" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.037524 4867 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b12688cd-6bde-4fda-9453-18e0238eb201" containerName="registry-server" Dec 01 10:14:44 crc kubenswrapper[4867]: E1201 10:14:44.037596 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12688cd-6bde-4fda-9453-18e0238eb201" containerName="extract-content" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.037648 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12688cd-6bde-4fda-9453-18e0238eb201" containerName="extract-content" Dec 01 10:14:44 crc kubenswrapper[4867]: E1201 10:14:44.037707 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12688cd-6bde-4fda-9453-18e0238eb201" containerName="extract-utilities" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.037758 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12688cd-6bde-4fda-9453-18e0238eb201" containerName="extract-utilities" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.038021 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12688cd-6bde-4fda-9453-18e0238eb201" containerName="registry-server" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.039463 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.049382 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lz2t2"] Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.188338 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-utilities\") pod \"redhat-marketplace-lz2t2\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.188428 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kqs\" (UniqueName: \"kubernetes.io/projected/549fd56c-ed44-49b3-8379-9f5f5e138f9a-kube-api-access-x8kqs\") pod \"redhat-marketplace-lz2t2\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.188487 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-catalog-content\") pod \"redhat-marketplace-lz2t2\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.290282 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-utilities\") pod \"redhat-marketplace-lz2t2\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.290349 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-x8kqs\" (UniqueName: \"kubernetes.io/projected/549fd56c-ed44-49b3-8379-9f5f5e138f9a-kube-api-access-x8kqs\") pod \"redhat-marketplace-lz2t2\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.290388 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-catalog-content\") pod \"redhat-marketplace-lz2t2\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.290889 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-catalog-content\") pod \"redhat-marketplace-lz2t2\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.291187 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-utilities\") pod \"redhat-marketplace-lz2t2\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.606402 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kqs\" (UniqueName: \"kubernetes.io/projected/549fd56c-ed44-49b3-8379-9f5f5e138f9a-kube-api-access-x8kqs\") pod \"redhat-marketplace-lz2t2\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:44 crc kubenswrapper[4867]: I1201 10:14:44.658688 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:45 crc kubenswrapper[4867]: I1201 10:14:45.155476 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lz2t2"] Dec 01 10:14:45 crc kubenswrapper[4867]: I1201 10:14:45.813982 4867 generic.go:334] "Generic (PLEG): container finished" podID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerID="7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9" exitCode=0 Dec 01 10:14:45 crc kubenswrapper[4867]: I1201 10:14:45.814192 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lz2t2" event={"ID":"549fd56c-ed44-49b3-8379-9f5f5e138f9a","Type":"ContainerDied","Data":"7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9"} Dec 01 10:14:45 crc kubenswrapper[4867]: I1201 10:14:45.814274 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lz2t2" event={"ID":"549fd56c-ed44-49b3-8379-9f5f5e138f9a","Type":"ContainerStarted","Data":"fb30472cfc454cb9e5940b1ec6bda25af1f916f3f0514de312fc0874ad246a43"} Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.433779 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fw4xj"] Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.438195 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.462681 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fw4xj"] Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.533712 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-utilities\") pod \"certified-operators-fw4xj\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.534057 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-catalog-content\") pod \"certified-operators-fw4xj\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.534413 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbs8\" (UniqueName: \"kubernetes.io/projected/8b107ca4-8921-495d-a9d4-4539d1b07505-kube-api-access-dkbs8\") pod \"certified-operators-fw4xj\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.636475 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-catalog-content\") pod \"certified-operators-fw4xj\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.636604 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dkbs8\" (UniqueName: \"kubernetes.io/projected/8b107ca4-8921-495d-a9d4-4539d1b07505-kube-api-access-dkbs8\") pod \"certified-operators-fw4xj\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.636662 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-utilities\") pod \"certified-operators-fw4xj\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.637009 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-catalog-content\") pod \"certified-operators-fw4xj\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.637060 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-utilities\") pod \"certified-operators-fw4xj\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.672886 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbs8\" (UniqueName: \"kubernetes.io/projected/8b107ca4-8921-495d-a9d4-4539d1b07505-kube-api-access-dkbs8\") pod \"certified-operators-fw4xj\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:46 crc kubenswrapper[4867]: I1201 10:14:46.756664 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:47 crc kubenswrapper[4867]: I1201 10:14:47.388253 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fw4xj"] Dec 01 10:14:47 crc kubenswrapper[4867]: I1201 10:14:47.837323 4867 generic.go:334] "Generic (PLEG): container finished" podID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerID="471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a" exitCode=0 Dec 01 10:14:47 crc kubenswrapper[4867]: I1201 10:14:47.837517 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lz2t2" event={"ID":"549fd56c-ed44-49b3-8379-9f5f5e138f9a","Type":"ContainerDied","Data":"471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a"} Dec 01 10:14:47 crc kubenswrapper[4867]: I1201 10:14:47.840881 4867 generic.go:334] "Generic (PLEG): container finished" podID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerID="0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d" exitCode=0 Dec 01 10:14:47 crc kubenswrapper[4867]: I1201 10:14:47.840916 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fw4xj" event={"ID":"8b107ca4-8921-495d-a9d4-4539d1b07505","Type":"ContainerDied","Data":"0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d"} Dec 01 10:14:47 crc kubenswrapper[4867]: I1201 10:14:47.840938 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fw4xj" event={"ID":"8b107ca4-8921-495d-a9d4-4539d1b07505","Type":"ContainerStarted","Data":"c83a503a7ab54e5cd2530ab5c9a1f9d4e4b68cfb96b76461e3b9583cc64d10e9"} Dec 01 10:14:48 crc kubenswrapper[4867]: I1201 10:14:48.856515 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fw4xj" 
event={"ID":"8b107ca4-8921-495d-a9d4-4539d1b07505","Type":"ContainerStarted","Data":"b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e"} Dec 01 10:14:48 crc kubenswrapper[4867]: I1201 10:14:48.864992 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lz2t2" event={"ID":"549fd56c-ed44-49b3-8379-9f5f5e138f9a","Type":"ContainerStarted","Data":"305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee"} Dec 01 10:14:49 crc kubenswrapper[4867]: I1201 10:14:49.874867 4867 generic.go:334] "Generic (PLEG): container finished" podID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerID="b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e" exitCode=0 Dec 01 10:14:49 crc kubenswrapper[4867]: I1201 10:14:49.875049 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fw4xj" event={"ID":"8b107ca4-8921-495d-a9d4-4539d1b07505","Type":"ContainerDied","Data":"b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e"} Dec 01 10:14:49 crc kubenswrapper[4867]: I1201 10:14:49.905385 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lz2t2" podStartSLOduration=3.445307925 podStartE2EDuration="5.90536184s" podCreationTimestamp="2025-12-01 10:14:44 +0000 UTC" firstStartedPulling="2025-12-01 10:14:45.817306681 +0000 UTC m=+4007.276693435" lastFinishedPulling="2025-12-01 10:14:48.277360596 +0000 UTC m=+4009.736747350" observedRunningTime="2025-12-01 10:14:48.912300387 +0000 UTC m=+4010.371687141" watchObservedRunningTime="2025-12-01 10:14:49.90536184 +0000 UTC m=+4011.364748594" Dec 01 10:14:50 crc kubenswrapper[4867]: I1201 10:14:50.887419 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fw4xj" 
event={"ID":"8b107ca4-8921-495d-a9d4-4539d1b07505","Type":"ContainerStarted","Data":"4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175"} Dec 01 10:14:50 crc kubenswrapper[4867]: I1201 10:14:50.909150 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fw4xj" podStartSLOduration=2.356773423 podStartE2EDuration="4.909128678s" podCreationTimestamp="2025-12-01 10:14:46 +0000 UTC" firstStartedPulling="2025-12-01 10:14:47.842329595 +0000 UTC m=+4009.301716349" lastFinishedPulling="2025-12-01 10:14:50.39468485 +0000 UTC m=+4011.854071604" observedRunningTime="2025-12-01 10:14:50.906060514 +0000 UTC m=+4012.365447258" watchObservedRunningTime="2025-12-01 10:14:50.909128678 +0000 UTC m=+4012.368515432" Dec 01 10:14:54 crc kubenswrapper[4867]: I1201 10:14:54.659333 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:54 crc kubenswrapper[4867]: I1201 10:14:54.659928 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:54 crc kubenswrapper[4867]: I1201 10:14:54.711644 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:54 crc kubenswrapper[4867]: I1201 10:14:54.960536 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:56 crc kubenswrapper[4867]: I1201 10:14:56.028841 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lz2t2"] Dec 01 10:14:56 crc kubenswrapper[4867]: I1201 10:14:56.759062 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:56 crc kubenswrapper[4867]: I1201 10:14:56.759302 4867 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:56 crc kubenswrapper[4867]: I1201 10:14:56.848259 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:56 crc kubenswrapper[4867]: I1201 10:14:56.932297 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lz2t2" podUID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerName="registry-server" containerID="cri-o://305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee" gracePeriod=2 Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.053196 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.569260 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.709774 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-catalog-content\") pod \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.710233 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8kqs\" (UniqueName: \"kubernetes.io/projected/549fd56c-ed44-49b3-8379-9f5f5e138f9a-kube-api-access-x8kqs\") pod \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.710323 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-utilities\") pod \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\" (UID: \"549fd56c-ed44-49b3-8379-9f5f5e138f9a\") " Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.711080 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-utilities" (OuterVolumeSpecName: "utilities") pod "549fd56c-ed44-49b3-8379-9f5f5e138f9a" (UID: "549fd56c-ed44-49b3-8379-9f5f5e138f9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.718254 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549fd56c-ed44-49b3-8379-9f5f5e138f9a-kube-api-access-x8kqs" (OuterVolumeSpecName: "kube-api-access-x8kqs") pod "549fd56c-ed44-49b3-8379-9f5f5e138f9a" (UID: "549fd56c-ed44-49b3-8379-9f5f5e138f9a"). InnerVolumeSpecName "kube-api-access-x8kqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.734284 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "549fd56c-ed44-49b3-8379-9f5f5e138f9a" (UID: "549fd56c-ed44-49b3-8379-9f5f5e138f9a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.813104 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.813132 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/549fd56c-ed44-49b3-8379-9f5f5e138f9a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.813144 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8kqs\" (UniqueName: \"kubernetes.io/projected/549fd56c-ed44-49b3-8379-9f5f5e138f9a-kube-api-access-x8kqs\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.944989 4867 generic.go:334] "Generic (PLEG): container finished" podID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerID="305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee" exitCode=0 Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.945098 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lz2t2" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.945129 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lz2t2" event={"ID":"549fd56c-ed44-49b3-8379-9f5f5e138f9a","Type":"ContainerDied","Data":"305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee"} Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.945171 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lz2t2" event={"ID":"549fd56c-ed44-49b3-8379-9f5f5e138f9a","Type":"ContainerDied","Data":"fb30472cfc454cb9e5940b1ec6bda25af1f916f3f0514de312fc0874ad246a43"} Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.945190 4867 scope.go:117] "RemoveContainer" containerID="305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee" Dec 01 10:14:57 crc kubenswrapper[4867]: I1201 10:14:57.989590 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lz2t2"] Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.002248 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lz2t2"] Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.005926 4867 scope.go:117] "RemoveContainer" containerID="471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a" Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.030590 4867 scope.go:117] "RemoveContainer" containerID="7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9" Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.077514 4867 scope.go:117] "RemoveContainer" containerID="305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee" Dec 01 10:14:58 crc kubenswrapper[4867]: E1201 10:14:58.078250 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee\": container with ID starting with 305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee not found: ID does not exist" containerID="305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee" Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.078297 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee"} err="failed to get container status \"305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee\": rpc error: code = NotFound desc = could not find container \"305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee\": container with ID starting with 305af919bb139b509cfd5f16bffdfe7a7bb0d654dd37a1cb6c83634ac7d91aee not found: ID does not exist" Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.078327 4867 scope.go:117] "RemoveContainer" containerID="471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a" Dec 01 10:14:58 crc kubenswrapper[4867]: E1201 10:14:58.079122 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a\": container with ID starting with 471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a not found: ID does not exist" containerID="471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a" Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.079156 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a"} err="failed to get container status \"471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a\": rpc error: code = NotFound desc = could not find container \"471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a\": container with ID 
starting with 471d94d747e5fb0a09156d782e1e7d8c007aa3e60c22c6d6412c80dd1634853a not found: ID does not exist" Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.079177 4867 scope.go:117] "RemoveContainer" containerID="7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9" Dec 01 10:14:58 crc kubenswrapper[4867]: E1201 10:14:58.079403 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9\": container with ID starting with 7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9 not found: ID does not exist" containerID="7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9" Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.079428 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9"} err="failed to get container status \"7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9\": rpc error: code = NotFound desc = could not find container \"7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9\": container with ID starting with 7a2c0ac37e01c112cd2bdb5e504eb27733f35bdfdd7a2645b029483a54e056f9 not found: ID does not exist" Dec 01 10:14:58 crc kubenswrapper[4867]: I1201 10:14:58.840326 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" path="/var/lib/kubelet/pods/549fd56c-ed44-49b3-8379-9f5f5e138f9a/volumes" Dec 01 10:14:59 crc kubenswrapper[4867]: I1201 10:14:59.225939 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fw4xj"] Dec 01 10:14:59 crc kubenswrapper[4867]: I1201 10:14:59.966088 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fw4xj" 
podUID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerName="registry-server" containerID="cri-o://4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175" gracePeriod=2 Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.224638 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz"] Dec 01 10:15:00 crc kubenswrapper[4867]: E1201 10:15:00.225371 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerName="extract-utilities" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.225383 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerName="extract-utilities" Dec 01 10:15:00 crc kubenswrapper[4867]: E1201 10:15:00.225407 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerName="registry-server" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.225413 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerName="registry-server" Dec 01 10:15:00 crc kubenswrapper[4867]: E1201 10:15:00.225435 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerName="extract-content" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.225441 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerName="extract-content" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.226108 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="549fd56c-ed44-49b3-8379-9f5f5e138f9a" containerName="registry-server" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.226957 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.234760 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz"] Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.236503 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.236967 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.361947 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28a56290-ba72-4944-be33-f3b057926eb7-config-volume\") pod \"collect-profiles-29409735-z5lgz\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.361996 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28a56290-ba72-4944-be33-f3b057926eb7-secret-volume\") pod \"collect-profiles-29409735-z5lgz\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.362038 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4nxn\" (UniqueName: \"kubernetes.io/projected/28a56290-ba72-4944-be33-f3b057926eb7-kube-api-access-m4nxn\") pod \"collect-profiles-29409735-z5lgz\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.463931 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28a56290-ba72-4944-be33-f3b057926eb7-config-volume\") pod \"collect-profiles-29409735-z5lgz\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.463971 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28a56290-ba72-4944-be33-f3b057926eb7-secret-volume\") pod \"collect-profiles-29409735-z5lgz\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.463998 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4nxn\" (UniqueName: \"kubernetes.io/projected/28a56290-ba72-4944-be33-f3b057926eb7-kube-api-access-m4nxn\") pod \"collect-profiles-29409735-z5lgz\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.465162 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28a56290-ba72-4944-be33-f3b057926eb7-config-volume\") pod \"collect-profiles-29409735-z5lgz\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.474387 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/28a56290-ba72-4944-be33-f3b057926eb7-secret-volume\") pod \"collect-profiles-29409735-z5lgz\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.485018 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4nxn\" (UniqueName: \"kubernetes.io/projected/28a56290-ba72-4944-be33-f3b057926eb7-kube-api-access-m4nxn\") pod \"collect-profiles-29409735-z5lgz\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.551581 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.684930 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.771637 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkbs8\" (UniqueName: \"kubernetes.io/projected/8b107ca4-8921-495d-a9d4-4539d1b07505-kube-api-access-dkbs8\") pod \"8b107ca4-8921-495d-a9d4-4539d1b07505\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.771776 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-utilities\") pod \"8b107ca4-8921-495d-a9d4-4539d1b07505\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.771808 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-catalog-content\") pod \"8b107ca4-8921-495d-a9d4-4539d1b07505\" (UID: \"8b107ca4-8921-495d-a9d4-4539d1b07505\") " Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.773038 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-utilities" (OuterVolumeSpecName: "utilities") pod "8b107ca4-8921-495d-a9d4-4539d1b07505" (UID: "8b107ca4-8921-495d-a9d4-4539d1b07505"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.776503 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b107ca4-8921-495d-a9d4-4539d1b07505-kube-api-access-dkbs8" (OuterVolumeSpecName: "kube-api-access-dkbs8") pod "8b107ca4-8921-495d-a9d4-4539d1b07505" (UID: "8b107ca4-8921-495d-a9d4-4539d1b07505"). InnerVolumeSpecName "kube-api-access-dkbs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.883608 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b107ca4-8921-495d-a9d4-4539d1b07505" (UID: "8b107ca4-8921-495d-a9d4-4539d1b07505"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.885892 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkbs8\" (UniqueName: \"kubernetes.io/projected/8b107ca4-8921-495d-a9d4-4539d1b07505-kube-api-access-dkbs8\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.885939 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.885952 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b107ca4-8921-495d-a9d4-4539d1b07505-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.976274 4867 generic.go:334] "Generic (PLEG): container finished" podID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerID="4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175" exitCode=0 Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.976324 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fw4xj" event={"ID":"8b107ca4-8921-495d-a9d4-4539d1b07505","Type":"ContainerDied","Data":"4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175"} Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.976356 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-fw4xj" event={"ID":"8b107ca4-8921-495d-a9d4-4539d1b07505","Type":"ContainerDied","Data":"c83a503a7ab54e5cd2530ab5c9a1f9d4e4b68cfb96b76461e3b9583cc64d10e9"} Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.976379 4867 scope.go:117] "RemoveContainer" containerID="4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175" Dec 01 10:15:00 crc kubenswrapper[4867]: I1201 10:15:00.976568 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fw4xj" Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.019766 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fw4xj"] Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.019789 4867 scope.go:117] "RemoveContainer" containerID="b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e" Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.033218 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fw4xj"] Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.064837 4867 scope.go:117] "RemoveContainer" containerID="0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d" Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.104412 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz"] Dec 01 10:15:01 crc kubenswrapper[4867]: W1201 10:15:01.110996 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a56290_ba72_4944_be33_f3b057926eb7.slice/crio-0b9a58c8210aacd7a43036b84c387892cb92a1e5ab1ba9d2a32aa695ea4c864f WatchSource:0}: Error finding container 0b9a58c8210aacd7a43036b84c387892cb92a1e5ab1ba9d2a32aa695ea4c864f: Status 404 returned error can't find the container with id 
0b9a58c8210aacd7a43036b84c387892cb92a1e5ab1ba9d2a32aa695ea4c864f Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.118213 4867 scope.go:117] "RemoveContainer" containerID="4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175" Dec 01 10:15:01 crc kubenswrapper[4867]: E1201 10:15:01.118875 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175\": container with ID starting with 4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175 not found: ID does not exist" containerID="4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175" Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.118915 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175"} err="failed to get container status \"4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175\": rpc error: code = NotFound desc = could not find container \"4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175\": container with ID starting with 4dd1d48a539d18b53e30da8384b21678968ba10684b1c7b8284ef396f39aa175 not found: ID does not exist" Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.118941 4867 scope.go:117] "RemoveContainer" containerID="b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e" Dec 01 10:15:01 crc kubenswrapper[4867]: E1201 10:15:01.119238 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e\": container with ID starting with b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e not found: ID does not exist" containerID="b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e" Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 
10:15:01.119270 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e"} err="failed to get container status \"b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e\": rpc error: code = NotFound desc = could not find container \"b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e\": container with ID starting with b3b517411988f4e7a61c0534e1090982dbbb06230bc82aa1a0b8d6492901238e not found: ID does not exist" Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.119289 4867 scope.go:117] "RemoveContainer" containerID="0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d" Dec 01 10:15:01 crc kubenswrapper[4867]: E1201 10:15:01.119668 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d\": container with ID starting with 0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d not found: ID does not exist" containerID="0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d" Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.119714 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d"} err="failed to get container status \"0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d\": rpc error: code = NotFound desc = could not find container \"0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d\": container with ID starting with 0c431dffff4ed0b67afb24c39bfbf04f24407065d13b7b907971de2735d6954d not found: ID does not exist" Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.986293 4867 generic.go:334] "Generic (PLEG): container finished" podID="28a56290-ba72-4944-be33-f3b057926eb7" 
containerID="4ed7fd4cf2fdb88a3c1ab4d92bf7309b08329f199852d502381e87137d6cbdec" exitCode=0 Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.986438 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" event={"ID":"28a56290-ba72-4944-be33-f3b057926eb7","Type":"ContainerDied","Data":"4ed7fd4cf2fdb88a3c1ab4d92bf7309b08329f199852d502381e87137d6cbdec"} Dec 01 10:15:01 crc kubenswrapper[4867]: I1201 10:15:01.986633 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" event={"ID":"28a56290-ba72-4944-be33-f3b057926eb7","Type":"ContainerStarted","Data":"0b9a58c8210aacd7a43036b84c387892cb92a1e5ab1ba9d2a32aa695ea4c864f"} Dec 01 10:15:02 crc kubenswrapper[4867]: I1201 10:15:02.842984 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b107ca4-8921-495d-a9d4-4539d1b07505" path="/var/lib/kubelet/pods/8b107ca4-8921-495d-a9d4-4539d1b07505/volumes" Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.726050 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.854425 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28a56290-ba72-4944-be33-f3b057926eb7-secret-volume\") pod \"28a56290-ba72-4944-be33-f3b057926eb7\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.854858 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4nxn\" (UniqueName: \"kubernetes.io/projected/28a56290-ba72-4944-be33-f3b057926eb7-kube-api-access-m4nxn\") pod \"28a56290-ba72-4944-be33-f3b057926eb7\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.855042 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28a56290-ba72-4944-be33-f3b057926eb7-config-volume\") pod \"28a56290-ba72-4944-be33-f3b057926eb7\" (UID: \"28a56290-ba72-4944-be33-f3b057926eb7\") " Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.855550 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a56290-ba72-4944-be33-f3b057926eb7-config-volume" (OuterVolumeSpecName: "config-volume") pod "28a56290-ba72-4944-be33-f3b057926eb7" (UID: "28a56290-ba72-4944-be33-f3b057926eb7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.861997 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a56290-ba72-4944-be33-f3b057926eb7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "28a56290-ba72-4944-be33-f3b057926eb7" (UID: "28a56290-ba72-4944-be33-f3b057926eb7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.865105 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a56290-ba72-4944-be33-f3b057926eb7-kube-api-access-m4nxn" (OuterVolumeSpecName: "kube-api-access-m4nxn") pod "28a56290-ba72-4944-be33-f3b057926eb7" (UID: "28a56290-ba72-4944-be33-f3b057926eb7"). InnerVolumeSpecName "kube-api-access-m4nxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.957601 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28a56290-ba72-4944-be33-f3b057926eb7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.957637 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28a56290-ba72-4944-be33-f3b057926eb7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:03 crc kubenswrapper[4867]: I1201 10:15:03.957650 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4nxn\" (UniqueName: \"kubernetes.io/projected/28a56290-ba72-4944-be33-f3b057926eb7-kube-api-access-m4nxn\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:04 crc kubenswrapper[4867]: I1201 10:15:04.007714 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" event={"ID":"28a56290-ba72-4944-be33-f3b057926eb7","Type":"ContainerDied","Data":"0b9a58c8210aacd7a43036b84c387892cb92a1e5ab1ba9d2a32aa695ea4c864f"} Dec 01 10:15:04 crc kubenswrapper[4867]: I1201 10:15:04.008104 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b9a58c8210aacd7a43036b84c387892cb92a1e5ab1ba9d2a32aa695ea4c864f" Dec 01 10:15:04 crc kubenswrapper[4867]: I1201 10:15:04.007766 4867 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-z5lgz" Dec 01 10:15:04 crc kubenswrapper[4867]: I1201 10:15:04.799318 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z"] Dec 01 10:15:04 crc kubenswrapper[4867]: I1201 10:15:04.808350 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-5q42z"] Dec 01 10:15:04 crc kubenswrapper[4867]: I1201 10:15:04.855655 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b895e470-c6f1-4072-b66d-b6c3cb1791e8" path="/var/lib/kubelet/pods/b895e470-c6f1-4072-b66d-b6c3cb1791e8/volumes" Dec 01 10:15:13 crc kubenswrapper[4867]: I1201 10:15:13.712937 4867 trace.go:236] Trace[185595728]: "Calculate volume metrics of config for pod openstack/neutron-59b9c878df-5k6nq" (01-Dec-2025 10:15:10.988) (total time: 2708ms): Dec 01 10:15:13 crc kubenswrapper[4867]: Trace[185595728]: [2.708018941s] [2.708018941s] END Dec 01 10:15:13 crc kubenswrapper[4867]: E1201 10:15:13.887557 4867 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.901s" Dec 01 10:15:22 crc kubenswrapper[4867]: I1201 10:15:22.230505 4867 scope.go:117] "RemoveContainer" containerID="67f71ca5c1f144414e072478412624ac6438ededc97a3353c0696e206d24dfe1" Dec 01 10:15:51 crc kubenswrapper[4867]: I1201 10:15:51.601093 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:15:51 crc kubenswrapper[4867]: I1201 10:15:51.603011 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" 
podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:16:21 crc kubenswrapper[4867]: I1201 10:16:21.601921 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:16:21 crc kubenswrapper[4867]: I1201 10:16:21.602485 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:16:51 crc kubenswrapper[4867]: I1201 10:16:51.601091 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:16:51 crc kubenswrapper[4867]: I1201 10:16:51.601660 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:16:51 crc kubenswrapper[4867]: I1201 10:16:51.601703 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 10:16:51 crc kubenswrapper[4867]: I1201 10:16:51.602465 4867 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7038d3231484ebe59527ae4898f9d6a56414acfb58f056d0591d36edea976f2b"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:16:51 crc kubenswrapper[4867]: I1201 10:16:51.602529 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://7038d3231484ebe59527ae4898f9d6a56414acfb58f056d0591d36edea976f2b" gracePeriod=600 Dec 01 10:16:51 crc kubenswrapper[4867]: I1201 10:16:51.814223 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="7038d3231484ebe59527ae4898f9d6a56414acfb58f056d0591d36edea976f2b" exitCode=0 Dec 01 10:16:51 crc kubenswrapper[4867]: I1201 10:16:51.814270 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"7038d3231484ebe59527ae4898f9d6a56414acfb58f056d0591d36edea976f2b"} Dec 01 10:16:51 crc kubenswrapper[4867]: I1201 10:16:51.814308 4867 scope.go:117] "RemoveContainer" containerID="167a16a6333e2c83b39a3ee1291cdca6583be18cc42696e73feb9f9bfe9bbc7b" Dec 01 10:16:52 crc kubenswrapper[4867]: I1201 10:16:52.825140 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b"} Dec 01 10:18:51 crc kubenswrapper[4867]: I1201 10:18:51.601512 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:18:51 crc kubenswrapper[4867]: I1201 10:18:51.602056 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:19:21 crc kubenswrapper[4867]: I1201 10:19:21.601403 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:19:21 crc kubenswrapper[4867]: I1201 10:19:21.602039 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:19:51 crc kubenswrapper[4867]: I1201 10:19:51.601842 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:19:51 crc kubenswrapper[4867]: I1201 10:19:51.602448 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:19:51 crc kubenswrapper[4867]: I1201 10:19:51.602494 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 10:19:51 crc kubenswrapper[4867]: I1201 10:19:51.603234 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:19:51 crc kubenswrapper[4867]: I1201 10:19:51.603285 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" gracePeriod=600 Dec 01 10:19:51 crc kubenswrapper[4867]: E1201 10:19:51.927683 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:19:52 crc kubenswrapper[4867]: I1201 10:19:52.595619 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" exitCode=0 Dec 01 10:19:52 crc kubenswrapper[4867]: I1201 10:19:52.595855 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b"} Dec 01 10:19:52 crc kubenswrapper[4867]: I1201 10:19:52.596070 4867 scope.go:117] "RemoveContainer" containerID="7038d3231484ebe59527ae4898f9d6a56414acfb58f056d0591d36edea976f2b" Dec 01 10:19:52 crc kubenswrapper[4867]: I1201 10:19:52.596917 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:19:52 crc kubenswrapper[4867]: E1201 10:19:52.597314 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:20:05 crc kubenswrapper[4867]: I1201 10:20:05.827545 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:20:05 crc kubenswrapper[4867]: E1201 10:20:05.828331 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:20:17 crc kubenswrapper[4867]: I1201 10:20:17.828420 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:20:17 crc kubenswrapper[4867]: E1201 
10:20:17.829540 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:20:29 crc kubenswrapper[4867]: I1201 10:20:29.827614 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:20:29 crc kubenswrapper[4867]: E1201 10:20:29.828409 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.214123 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b89fs"] Dec 01 10:20:40 crc kubenswrapper[4867]: E1201 10:20:40.214931 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerName="extract-content" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.214942 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerName="extract-content" Dec 01 10:20:40 crc kubenswrapper[4867]: E1201 10:20:40.214969 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a56290-ba72-4944-be33-f3b057926eb7" containerName="collect-profiles" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.214975 4867 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="28a56290-ba72-4944-be33-f3b057926eb7" containerName="collect-profiles" Dec 01 10:20:40 crc kubenswrapper[4867]: E1201 10:20:40.214989 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerName="extract-utilities" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.214995 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerName="extract-utilities" Dec 01 10:20:40 crc kubenswrapper[4867]: E1201 10:20:40.215005 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerName="registry-server" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.215012 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerName="registry-server" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.215180 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a56290-ba72-4944-be33-f3b057926eb7" containerName="collect-profiles" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.215192 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b107ca4-8921-495d-a9d4-4539d1b07505" containerName="registry-server" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.216434 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.228419 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b89fs"] Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.238934 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-catalog-content\") pod \"redhat-operators-b89fs\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.239121 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blkmj\" (UniqueName: \"kubernetes.io/projected/4b4199b5-bab4-4d51-b5e6-48c044a01841-kube-api-access-blkmj\") pod \"redhat-operators-b89fs\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.239182 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-utilities\") pod \"redhat-operators-b89fs\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.340407 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blkmj\" (UniqueName: \"kubernetes.io/projected/4b4199b5-bab4-4d51-b5e6-48c044a01841-kube-api-access-blkmj\") pod \"redhat-operators-b89fs\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.340460 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-utilities\") pod \"redhat-operators-b89fs\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.340496 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-catalog-content\") pod \"redhat-operators-b89fs\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.341011 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-utilities\") pod \"redhat-operators-b89fs\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.341031 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-catalog-content\") pod \"redhat-operators-b89fs\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.362228 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blkmj\" (UniqueName: \"kubernetes.io/projected/4b4199b5-bab4-4d51-b5e6-48c044a01841-kube-api-access-blkmj\") pod \"redhat-operators-b89fs\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:40 crc kubenswrapper[4867]: I1201 10:20:40.534761 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:41 crc kubenswrapper[4867]: I1201 10:20:41.063609 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b89fs"] Dec 01 10:20:41 crc kubenswrapper[4867]: I1201 10:20:41.828197 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:20:41 crc kubenswrapper[4867]: E1201 10:20:41.828909 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:20:42 crc kubenswrapper[4867]: I1201 10:20:42.086704 4867 generic.go:334] "Generic (PLEG): container finished" podID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerID="df02eda5a60dd33ac4daaed9ca5c42f7794ecfff47c23e73e6529b7e271f3aa0" exitCode=0 Dec 01 10:20:42 crc kubenswrapper[4867]: I1201 10:20:42.086900 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b89fs" event={"ID":"4b4199b5-bab4-4d51-b5e6-48c044a01841","Type":"ContainerDied","Data":"df02eda5a60dd33ac4daaed9ca5c42f7794ecfff47c23e73e6529b7e271f3aa0"} Dec 01 10:20:42 crc kubenswrapper[4867]: I1201 10:20:42.087004 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b89fs" event={"ID":"4b4199b5-bab4-4d51-b5e6-48c044a01841","Type":"ContainerStarted","Data":"0a0591a7c1e36d0778b039142a207c456d0de5d38ea5ebe3d260e51e84482c73"} Dec 01 10:20:42 crc kubenswrapper[4867]: I1201 10:20:42.089571 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:20:44 crc 
kubenswrapper[4867]: I1201 10:20:44.110717 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b89fs" event={"ID":"4b4199b5-bab4-4d51-b5e6-48c044a01841","Type":"ContainerStarted","Data":"574fddfecc0241f75a058f7ce1493cdc252b22222afc3fdface89f94cedc8d4b"} Dec 01 10:20:46 crc kubenswrapper[4867]: I1201 10:20:46.134997 4867 generic.go:334] "Generic (PLEG): container finished" podID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerID="574fddfecc0241f75a058f7ce1493cdc252b22222afc3fdface89f94cedc8d4b" exitCode=0 Dec 01 10:20:46 crc kubenswrapper[4867]: I1201 10:20:46.135207 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b89fs" event={"ID":"4b4199b5-bab4-4d51-b5e6-48c044a01841","Type":"ContainerDied","Data":"574fddfecc0241f75a058f7ce1493cdc252b22222afc3fdface89f94cedc8d4b"} Dec 01 10:20:47 crc kubenswrapper[4867]: I1201 10:20:47.146483 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b89fs" event={"ID":"4b4199b5-bab4-4d51-b5e6-48c044a01841","Type":"ContainerStarted","Data":"59267a3447189b12e345840903dae805d568068bd132c36932df4974776bf640"} Dec 01 10:20:47 crc kubenswrapper[4867]: I1201 10:20:47.180899 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b89fs" podStartSLOduration=2.698681059 podStartE2EDuration="7.180875749s" podCreationTimestamp="2025-12-01 10:20:40 +0000 UTC" firstStartedPulling="2025-12-01 10:20:42.089362411 +0000 UTC m=+4363.548749165" lastFinishedPulling="2025-12-01 10:20:46.571557101 +0000 UTC m=+4368.030943855" observedRunningTime="2025-12-01 10:20:47.173583119 +0000 UTC m=+4368.632969893" watchObservedRunningTime="2025-12-01 10:20:47.180875749 +0000 UTC m=+4368.640262503" Dec 01 10:20:50 crc kubenswrapper[4867]: I1201 10:20:50.535996 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:50 crc kubenswrapper[4867]: I1201 10:20:50.536446 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:20:51 crc kubenswrapper[4867]: I1201 10:20:51.587465 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b89fs" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerName="registry-server" probeResult="failure" output=< Dec 01 10:20:51 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 10:20:51 crc kubenswrapper[4867]: > Dec 01 10:20:53 crc kubenswrapper[4867]: I1201 10:20:53.827273 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:20:53 crc kubenswrapper[4867]: E1201 10:20:53.827684 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:21:00 crc kubenswrapper[4867]: I1201 10:21:00.590289 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:21:00 crc kubenswrapper[4867]: I1201 10:21:00.636395 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:21:00 crc kubenswrapper[4867]: I1201 10:21:00.844751 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b89fs"] Dec 01 10:21:02 crc kubenswrapper[4867]: I1201 10:21:02.258483 4867 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-b89fs" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerName="registry-server" containerID="cri-o://59267a3447189b12e345840903dae805d568068bd132c36932df4974776bf640" gracePeriod=2 Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.270724 4867 generic.go:334] "Generic (PLEG): container finished" podID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerID="59267a3447189b12e345840903dae805d568068bd132c36932df4974776bf640" exitCode=0 Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.270844 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b89fs" event={"ID":"4b4199b5-bab4-4d51-b5e6-48c044a01841","Type":"ContainerDied","Data":"59267a3447189b12e345840903dae805d568068bd132c36932df4974776bf640"} Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.271055 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b89fs" event={"ID":"4b4199b5-bab4-4d51-b5e6-48c044a01841","Type":"ContainerDied","Data":"0a0591a7c1e36d0778b039142a207c456d0de5d38ea5ebe3d260e51e84482c73"} Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.271074 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a0591a7c1e36d0778b039142a207c456d0de5d38ea5ebe3d260e51e84482c73" Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.313601 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.401788 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blkmj\" (UniqueName: \"kubernetes.io/projected/4b4199b5-bab4-4d51-b5e6-48c044a01841-kube-api-access-blkmj\") pod \"4b4199b5-bab4-4d51-b5e6-48c044a01841\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.402019 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-catalog-content\") pod \"4b4199b5-bab4-4d51-b5e6-48c044a01841\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.402088 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-utilities\") pod \"4b4199b5-bab4-4d51-b5e6-48c044a01841\" (UID: \"4b4199b5-bab4-4d51-b5e6-48c044a01841\") " Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.402988 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-utilities" (OuterVolumeSpecName: "utilities") pod "4b4199b5-bab4-4d51-b5e6-48c044a01841" (UID: "4b4199b5-bab4-4d51-b5e6-48c044a01841"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.407727 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b4199b5-bab4-4d51-b5e6-48c044a01841-kube-api-access-blkmj" (OuterVolumeSpecName: "kube-api-access-blkmj") pod "4b4199b5-bab4-4d51-b5e6-48c044a01841" (UID: "4b4199b5-bab4-4d51-b5e6-48c044a01841"). InnerVolumeSpecName "kube-api-access-blkmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.502967 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b4199b5-bab4-4d51-b5e6-48c044a01841" (UID: "4b4199b5-bab4-4d51-b5e6-48c044a01841"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.504205 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blkmj\" (UniqueName: \"kubernetes.io/projected/4b4199b5-bab4-4d51-b5e6-48c044a01841-kube-api-access-blkmj\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.504227 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:03 crc kubenswrapper[4867]: I1201 10:21:03.504235 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4199b5-bab4-4d51-b5e6-48c044a01841-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:04 crc kubenswrapper[4867]: I1201 10:21:04.278594 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b89fs" Dec 01 10:21:04 crc kubenswrapper[4867]: I1201 10:21:04.323538 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b89fs"] Dec 01 10:21:04 crc kubenswrapper[4867]: I1201 10:21:04.333772 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b89fs"] Dec 01 10:21:04 crc kubenswrapper[4867]: I1201 10:21:04.838014 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" path="/var/lib/kubelet/pods/4b4199b5-bab4-4d51-b5e6-48c044a01841/volumes" Dec 01 10:21:06 crc kubenswrapper[4867]: I1201 10:21:06.828049 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:21:06 crc kubenswrapper[4867]: E1201 10:21:06.828561 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:21:18 crc kubenswrapper[4867]: I1201 10:21:18.843138 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:21:18 crc kubenswrapper[4867]: E1201 10:21:18.843990 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" 
podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:21:31 crc kubenswrapper[4867]: I1201 10:21:31.827617 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:21:31 crc kubenswrapper[4867]: E1201 10:21:31.828505 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:21:43 crc kubenswrapper[4867]: I1201 10:21:43.827690 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:21:43 crc kubenswrapper[4867]: E1201 10:21:43.828552 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:21:58 crc kubenswrapper[4867]: I1201 10:21:58.833158 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:21:58 crc kubenswrapper[4867]: E1201 10:21:58.833977 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:22:12 crc kubenswrapper[4867]: I1201 10:22:12.826710 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:22:12 crc kubenswrapper[4867]: E1201 10:22:12.827770 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:22:25 crc kubenswrapper[4867]: I1201 10:22:25.827538 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:22:25 crc kubenswrapper[4867]: E1201 10:22:25.828893 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:22:37 crc kubenswrapper[4867]: I1201 10:22:37.827606 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:22:37 crc kubenswrapper[4867]: E1201 10:22:37.828517 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:22:51 crc kubenswrapper[4867]: I1201 10:22:51.827405 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:22:51 crc kubenswrapper[4867]: E1201 10:22:51.828330 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:23:06 crc kubenswrapper[4867]: I1201 10:23:06.827158 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:23:06 crc kubenswrapper[4867]: E1201 10:23:06.828021 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:23:17 crc kubenswrapper[4867]: I1201 10:23:17.827520 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:23:17 crc kubenswrapper[4867]: E1201 10:23:17.829031 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:23:30 crc kubenswrapper[4867]: I1201 10:23:30.828708 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:23:30 crc kubenswrapper[4867]: E1201 10:23:30.829600 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:23:41 crc kubenswrapper[4867]: I1201 10:23:41.827476 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:23:41 crc kubenswrapper[4867]: E1201 10:23:41.829235 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:23:53 crc kubenswrapper[4867]: I1201 10:23:53.826932 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:23:53 crc kubenswrapper[4867]: E1201 10:23:53.827735 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:24:04 crc kubenswrapper[4867]: I1201 10:24:04.826941 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:24:04 crc kubenswrapper[4867]: E1201 10:24:04.827605 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.603590 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n2r4r"] Dec 01 10:24:10 crc kubenswrapper[4867]: E1201 10:24:10.606256 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerName="extract-content" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.606276 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerName="extract-content" Dec 01 10:24:10 crc kubenswrapper[4867]: E1201 10:24:10.606314 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerName="registry-server" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.606321 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerName="registry-server" Dec 01 10:24:10 crc kubenswrapper[4867]: E1201 10:24:10.606339 4867 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerName="extract-utilities" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.606346 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerName="extract-utilities" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.608259 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b4199b5-bab4-4d51-b5e6-48c044a01841" containerName="registry-server" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.628452 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.700893 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2r4r"] Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.780554 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-utilities\") pod \"community-operators-n2r4r\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.780711 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-catalog-content\") pod \"community-operators-n2r4r\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.780763 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qp9x\" (UniqueName: \"kubernetes.io/projected/08ddab6f-4d02-473d-9e44-60bd0ec24a73-kube-api-access-9qp9x\") pod 
\"community-operators-n2r4r\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.882229 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-catalog-content\") pod \"community-operators-n2r4r\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.882304 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qp9x\" (UniqueName: \"kubernetes.io/projected/08ddab6f-4d02-473d-9e44-60bd0ec24a73-kube-api-access-9qp9x\") pod \"community-operators-n2r4r\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.882370 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-utilities\") pod \"community-operators-n2r4r\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.882703 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-catalog-content\") pod \"community-operators-n2r4r\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:10 crc kubenswrapper[4867]: I1201 10:24:10.882853 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-utilities\") pod \"community-operators-n2r4r\" (UID: 
\"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:11 crc kubenswrapper[4867]: I1201 10:24:11.016800 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qp9x\" (UniqueName: \"kubernetes.io/projected/08ddab6f-4d02-473d-9e44-60bd0ec24a73-kube-api-access-9qp9x\") pod \"community-operators-n2r4r\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:11 crc kubenswrapper[4867]: I1201 10:24:11.284628 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:12 crc kubenswrapper[4867]: I1201 10:24:12.077273 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2r4r"] Dec 01 10:24:12 crc kubenswrapper[4867]: I1201 10:24:12.276271 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2r4r" event={"ID":"08ddab6f-4d02-473d-9e44-60bd0ec24a73","Type":"ContainerStarted","Data":"c12c7448e6c17f9aa5af86cc98f77eff72422944292e473496d110fe0a97df29"} Dec 01 10:24:13 crc kubenswrapper[4867]: I1201 10:24:13.286040 4867 generic.go:334] "Generic (PLEG): container finished" podID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerID="12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b" exitCode=0 Dec 01 10:24:13 crc kubenswrapper[4867]: I1201 10:24:13.286086 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2r4r" event={"ID":"08ddab6f-4d02-473d-9e44-60bd0ec24a73","Type":"ContainerDied","Data":"12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b"} Dec 01 10:24:14 crc kubenswrapper[4867]: I1201 10:24:14.299759 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2r4r" 
event={"ID":"08ddab6f-4d02-473d-9e44-60bd0ec24a73","Type":"ContainerStarted","Data":"8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd"} Dec 01 10:24:15 crc kubenswrapper[4867]: I1201 10:24:15.329636 4867 generic.go:334] "Generic (PLEG): container finished" podID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerID="8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd" exitCode=0 Dec 01 10:24:15 crc kubenswrapper[4867]: I1201 10:24:15.329692 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2r4r" event={"ID":"08ddab6f-4d02-473d-9e44-60bd0ec24a73","Type":"ContainerDied","Data":"8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd"} Dec 01 10:24:15 crc kubenswrapper[4867]: I1201 10:24:15.827184 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:24:15 crc kubenswrapper[4867]: E1201 10:24:15.827856 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:24:16 crc kubenswrapper[4867]: I1201 10:24:16.343508 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2r4r" event={"ID":"08ddab6f-4d02-473d-9e44-60bd0ec24a73","Type":"ContainerStarted","Data":"dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f"} Dec 01 10:24:16 crc kubenswrapper[4867]: I1201 10:24:16.365583 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n2r4r" podStartSLOduration=3.819774823 podStartE2EDuration="6.365558527s" 
podCreationTimestamp="2025-12-01 10:24:10 +0000 UTC" firstStartedPulling="2025-12-01 10:24:13.288136863 +0000 UTC m=+4574.747523617" lastFinishedPulling="2025-12-01 10:24:15.833920567 +0000 UTC m=+4577.293307321" observedRunningTime="2025-12-01 10:24:16.362832002 +0000 UTC m=+4577.822218776" watchObservedRunningTime="2025-12-01 10:24:16.365558527 +0000 UTC m=+4577.824945291" Dec 01 10:24:21 crc kubenswrapper[4867]: I1201 10:24:21.285384 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:21 crc kubenswrapper[4867]: I1201 10:24:21.285711 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:21 crc kubenswrapper[4867]: I1201 10:24:21.334588 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:21 crc kubenswrapper[4867]: I1201 10:24:21.427348 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:21 crc kubenswrapper[4867]: I1201 10:24:21.568243 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2r4r"] Dec 01 10:24:23 crc kubenswrapper[4867]: I1201 10:24:23.397984 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n2r4r" podUID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerName="registry-server" containerID="cri-o://dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f" gracePeriod=2 Dec 01 10:24:23 crc kubenswrapper[4867]: I1201 10:24:23.964015 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.135732 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-utilities\") pod \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.135834 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qp9x\" (UniqueName: \"kubernetes.io/projected/08ddab6f-4d02-473d-9e44-60bd0ec24a73-kube-api-access-9qp9x\") pod \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.136016 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-catalog-content\") pod \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\" (UID: \"08ddab6f-4d02-473d-9e44-60bd0ec24a73\") " Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.137341 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-utilities" (OuterVolumeSpecName: "utilities") pod "08ddab6f-4d02-473d-9e44-60bd0ec24a73" (UID: "08ddab6f-4d02-473d-9e44-60bd0ec24a73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.147174 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ddab6f-4d02-473d-9e44-60bd0ec24a73-kube-api-access-9qp9x" (OuterVolumeSpecName: "kube-api-access-9qp9x") pod "08ddab6f-4d02-473d-9e44-60bd0ec24a73" (UID: "08ddab6f-4d02-473d-9e44-60bd0ec24a73"). InnerVolumeSpecName "kube-api-access-9qp9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.207387 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08ddab6f-4d02-473d-9e44-60bd0ec24a73" (UID: "08ddab6f-4d02-473d-9e44-60bd0ec24a73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.238004 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.238319 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qp9x\" (UniqueName: \"kubernetes.io/projected/08ddab6f-4d02-473d-9e44-60bd0ec24a73-kube-api-access-9qp9x\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.238619 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08ddab6f-4d02-473d-9e44-60bd0ec24a73-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.408669 4867 generic.go:334] "Generic (PLEG): container finished" podID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerID="dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f" exitCode=0 Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.408737 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n2r4r" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.408757 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2r4r" event={"ID":"08ddab6f-4d02-473d-9e44-60bd0ec24a73","Type":"ContainerDied","Data":"dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f"} Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.409106 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2r4r" event={"ID":"08ddab6f-4d02-473d-9e44-60bd0ec24a73","Type":"ContainerDied","Data":"c12c7448e6c17f9aa5af86cc98f77eff72422944292e473496d110fe0a97df29"} Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.409128 4867 scope.go:117] "RemoveContainer" containerID="dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.432042 4867 scope.go:117] "RemoveContainer" containerID="8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.449228 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2r4r"] Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.461571 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n2r4r"] Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.462570 4867 scope.go:117] "RemoveContainer" containerID="12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.506440 4867 scope.go:117] "RemoveContainer" containerID="dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f" Dec 01 10:24:24 crc kubenswrapper[4867]: E1201 10:24:24.506987 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f\": container with ID starting with dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f not found: ID does not exist" containerID="dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.507020 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f"} err="failed to get container status \"dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f\": rpc error: code = NotFound desc = could not find container \"dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f\": container with ID starting with dab9d2ca82318977b0059e6bb440df7fb58ecfcf1c04a1b9559bbec786fa9d2f not found: ID does not exist" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.507047 4867 scope.go:117] "RemoveContainer" containerID="8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd" Dec 01 10:24:24 crc kubenswrapper[4867]: E1201 10:24:24.507379 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd\": container with ID starting with 8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd not found: ID does not exist" containerID="8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.507403 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd"} err="failed to get container status \"8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd\": rpc error: code = NotFound desc = could not find container \"8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd\": container with ID 
starting with 8081ebfec01f4fee5e09edd3ccbd66142ff8b6ef7a68314a32efec1b3254cbcd not found: ID does not exist" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.507420 4867 scope.go:117] "RemoveContainer" containerID="12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b" Dec 01 10:24:24 crc kubenswrapper[4867]: E1201 10:24:24.507647 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b\": container with ID starting with 12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b not found: ID does not exist" containerID="12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.507672 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b"} err="failed to get container status \"12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b\": rpc error: code = NotFound desc = could not find container \"12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b\": container with ID starting with 12d2fcd5a01e09dad51626872cc24c5040bc966d294a32ee008f143d7153ac1b not found: ID does not exist" Dec 01 10:24:24 crc kubenswrapper[4867]: I1201 10:24:24.837802 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" path="/var/lib/kubelet/pods/08ddab6f-4d02-473d-9e44-60bd0ec24a73/volumes" Dec 01 10:24:28 crc kubenswrapper[4867]: I1201 10:24:28.835369 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:24:28 crc kubenswrapper[4867]: E1201 10:24:28.836200 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:24:39 crc kubenswrapper[4867]: I1201 10:24:39.827728 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:24:39 crc kubenswrapper[4867]: E1201 10:24:39.828884 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:24:54 crc kubenswrapper[4867]: I1201 10:24:54.827792 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:24:55 crc kubenswrapper[4867]: I1201 10:24:55.692356 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"d5239f7b77b22875fa81a2f4170c3ec389ab40f1ee4ab2299324fc1ac724ac7b"} Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.724842 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rrb9z"] Dec 01 10:26:52 crc kubenswrapper[4867]: E1201 10:26:52.725871 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerName="extract-content" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.725888 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" 
containerName="extract-content" Dec 01 10:26:52 crc kubenswrapper[4867]: E1201 10:26:52.725935 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerName="extract-utilities" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.725944 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerName="extract-utilities" Dec 01 10:26:52 crc kubenswrapper[4867]: E1201 10:26:52.725964 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerName="registry-server" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.725972 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerName="registry-server" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.726178 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ddab6f-4d02-473d-9e44-60bd0ec24a73" containerName="registry-server" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.727967 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.737676 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrb9z"] Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.769964 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-catalog-content\") pod \"certified-operators-rrb9z\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.770244 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sddnt\" (UniqueName: \"kubernetes.io/projected/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-kube-api-access-sddnt\") pod \"certified-operators-rrb9z\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.770353 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-utilities\") pod \"certified-operators-rrb9z\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.872189 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-catalog-content\") pod \"certified-operators-rrb9z\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.872290 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sddnt\" (UniqueName: \"kubernetes.io/projected/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-kube-api-access-sddnt\") pod \"certified-operators-rrb9z\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.872334 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-utilities\") pod \"certified-operators-rrb9z\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.873401 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-utilities\") pod \"certified-operators-rrb9z\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.873397 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-catalog-content\") pod \"certified-operators-rrb9z\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:52 crc kubenswrapper[4867]: I1201 10:26:52.900174 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sddnt\" (UniqueName: \"kubernetes.io/projected/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-kube-api-access-sddnt\") pod \"certified-operators-rrb9z\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:53 crc kubenswrapper[4867]: I1201 10:26:53.051847 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:26:53 crc kubenswrapper[4867]: I1201 10:26:53.684773 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrb9z"] Dec 01 10:26:53 crc kubenswrapper[4867]: I1201 10:26:53.707372 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrb9z" event={"ID":"35a899ec-abcc-42c5-9f2a-61a9e4f0c801","Type":"ContainerStarted","Data":"7f05c8e7cffbded9ede1a3abb36f3f297ac139d5bf1e5a17151434cbd4ce2e92"} Dec 01 10:26:54 crc kubenswrapper[4867]: I1201 10:26:54.733899 4867 generic.go:334] "Generic (PLEG): container finished" podID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerID="f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9" exitCode=0 Dec 01 10:26:54 crc kubenswrapper[4867]: I1201 10:26:54.734007 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrb9z" event={"ID":"35a899ec-abcc-42c5-9f2a-61a9e4f0c801","Type":"ContainerDied","Data":"f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9"} Dec 01 10:26:54 crc kubenswrapper[4867]: I1201 10:26:54.738217 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.114532 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lb8ng"] Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.116408 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.128570 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb8ng"] Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.138188 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-catalog-content\") pod \"redhat-marketplace-lb8ng\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.138230 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdrq4\" (UniqueName: \"kubernetes.io/projected/31292a47-8a65-496c-8c64-77e7d155df0e-kube-api-access-zdrq4\") pod \"redhat-marketplace-lb8ng\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.138256 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-utilities\") pod \"redhat-marketplace-lb8ng\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.240174 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-catalog-content\") pod \"redhat-marketplace-lb8ng\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.240229 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zdrq4\" (UniqueName: \"kubernetes.io/projected/31292a47-8a65-496c-8c64-77e7d155df0e-kube-api-access-zdrq4\") pod \"redhat-marketplace-lb8ng\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.240261 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-utilities\") pod \"redhat-marketplace-lb8ng\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.240675 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-catalog-content\") pod \"redhat-marketplace-lb8ng\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.240770 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-utilities\") pod \"redhat-marketplace-lb8ng\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.267980 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdrq4\" (UniqueName: \"kubernetes.io/projected/31292a47-8a65-496c-8c64-77e7d155df0e-kube-api-access-zdrq4\") pod \"redhat-marketplace-lb8ng\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.439251 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:26:55 crc kubenswrapper[4867]: I1201 10:26:55.961705 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb8ng"] Dec 01 10:26:56 crc kubenswrapper[4867]: I1201 10:26:56.758573 4867 generic.go:334] "Generic (PLEG): container finished" podID="31292a47-8a65-496c-8c64-77e7d155df0e" containerID="69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3" exitCode=0 Dec 01 10:26:56 crc kubenswrapper[4867]: I1201 10:26:56.758694 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb8ng" event={"ID":"31292a47-8a65-496c-8c64-77e7d155df0e","Type":"ContainerDied","Data":"69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3"} Dec 01 10:26:56 crc kubenswrapper[4867]: I1201 10:26:56.758985 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb8ng" event={"ID":"31292a47-8a65-496c-8c64-77e7d155df0e","Type":"ContainerStarted","Data":"57ffbd666c4b6399df0a8b6f27fc1175c12f7ea9b25a5b2ba0406a9529615c84"} Dec 01 10:26:56 crc kubenswrapper[4867]: I1201 10:26:56.761962 4867 generic.go:334] "Generic (PLEG): container finished" podID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerID="ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d" exitCode=0 Dec 01 10:26:56 crc kubenswrapper[4867]: I1201 10:26:56.762003 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrb9z" event={"ID":"35a899ec-abcc-42c5-9f2a-61a9e4f0c801","Type":"ContainerDied","Data":"ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d"} Dec 01 10:26:57 crc kubenswrapper[4867]: I1201 10:26:57.775537 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrb9z" 
event={"ID":"35a899ec-abcc-42c5-9f2a-61a9e4f0c801","Type":"ContainerStarted","Data":"7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec"} Dec 01 10:26:57 crc kubenswrapper[4867]: I1201 10:26:57.782961 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb8ng" event={"ID":"31292a47-8a65-496c-8c64-77e7d155df0e","Type":"ContainerStarted","Data":"fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5"} Dec 01 10:26:57 crc kubenswrapper[4867]: I1201 10:26:57.814036 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rrb9z" podStartSLOduration=3.243931637 podStartE2EDuration="5.814019158s" podCreationTimestamp="2025-12-01 10:26:52 +0000 UTC" firstStartedPulling="2025-12-01 10:26:54.737285273 +0000 UTC m=+4736.196672027" lastFinishedPulling="2025-12-01 10:26:57.307372784 +0000 UTC m=+4738.766759548" observedRunningTime="2025-12-01 10:26:57.802366729 +0000 UTC m=+4739.261753483" watchObservedRunningTime="2025-12-01 10:26:57.814019158 +0000 UTC m=+4739.273405912" Dec 01 10:26:58 crc kubenswrapper[4867]: I1201 10:26:58.793045 4867 generic.go:334] "Generic (PLEG): container finished" podID="31292a47-8a65-496c-8c64-77e7d155df0e" containerID="fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5" exitCode=0 Dec 01 10:26:58 crc kubenswrapper[4867]: I1201 10:26:58.793140 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb8ng" event={"ID":"31292a47-8a65-496c-8c64-77e7d155df0e","Type":"ContainerDied","Data":"fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5"} Dec 01 10:26:59 crc kubenswrapper[4867]: I1201 10:26:59.803699 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb8ng" 
event={"ID":"31292a47-8a65-496c-8c64-77e7d155df0e","Type":"ContainerStarted","Data":"84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761"} Dec 01 10:26:59 crc kubenswrapper[4867]: I1201 10:26:59.832905 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lb8ng" podStartSLOduration=2.254480654 podStartE2EDuration="4.832887523s" podCreationTimestamp="2025-12-01 10:26:55 +0000 UTC" firstStartedPulling="2025-12-01 10:26:56.760360743 +0000 UTC m=+4738.219747497" lastFinishedPulling="2025-12-01 10:26:59.338767622 +0000 UTC m=+4740.798154366" observedRunningTime="2025-12-01 10:26:59.821019008 +0000 UTC m=+4741.280405792" watchObservedRunningTime="2025-12-01 10:26:59.832887523 +0000 UTC m=+4741.292274277" Dec 01 10:27:03 crc kubenswrapper[4867]: I1201 10:27:03.052427 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:27:03 crc kubenswrapper[4867]: I1201 10:27:03.052993 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:27:03 crc kubenswrapper[4867]: I1201 10:27:03.552462 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:27:03 crc kubenswrapper[4867]: I1201 10:27:03.886868 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:27:04 crc kubenswrapper[4867]: I1201 10:27:04.308496 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rrb9z"] Dec 01 10:27:05 crc kubenswrapper[4867]: I1201 10:27:05.439368 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:27:05 crc kubenswrapper[4867]: I1201 10:27:05.440764 4867 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:27:05 crc kubenswrapper[4867]: I1201 10:27:05.497127 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:27:05 crc kubenswrapper[4867]: I1201 10:27:05.874443 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rrb9z" podUID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerName="registry-server" containerID="cri-o://7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec" gracePeriod=2 Dec 01 10:27:05 crc kubenswrapper[4867]: I1201 10:27:05.935628 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.478316 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.651700 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sddnt\" (UniqueName: \"kubernetes.io/projected/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-kube-api-access-sddnt\") pod \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.651970 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-utilities\") pod \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.652027 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-catalog-content\") pod \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\" (UID: \"35a899ec-abcc-42c5-9f2a-61a9e4f0c801\") " Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.652951 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-utilities" (OuterVolumeSpecName: "utilities") pod "35a899ec-abcc-42c5-9f2a-61a9e4f0c801" (UID: "35a899ec-abcc-42c5-9f2a-61a9e4f0c801"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.660472 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-kube-api-access-sddnt" (OuterVolumeSpecName: "kube-api-access-sddnt") pod "35a899ec-abcc-42c5-9f2a-61a9e4f0c801" (UID: "35a899ec-abcc-42c5-9f2a-61a9e4f0c801"). InnerVolumeSpecName "kube-api-access-sddnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.708279 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb8ng"] Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.709518 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35a899ec-abcc-42c5-9f2a-61a9e4f0c801" (UID: "35a899ec-abcc-42c5-9f2a-61a9e4f0c801"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.754744 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.754783 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.754797 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sddnt\" (UniqueName: \"kubernetes.io/projected/35a899ec-abcc-42c5-9f2a-61a9e4f0c801-kube-api-access-sddnt\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.884426 4867 generic.go:334] "Generic (PLEG): container finished" podID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerID="7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec" exitCode=0 Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.884476 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrb9z" event={"ID":"35a899ec-abcc-42c5-9f2a-61a9e4f0c801","Type":"ContainerDied","Data":"7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec"} Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.884509 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrb9z" event={"ID":"35a899ec-abcc-42c5-9f2a-61a9e4f0c801","Type":"ContainerDied","Data":"7f05c8e7cffbded9ede1a3abb36f3f297ac139d5bf1e5a17151434cbd4ce2e92"} Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.884527 4867 scope.go:117] "RemoveContainer" containerID="7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 
10:27:06.884533 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrb9z" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.904307 4867 scope.go:117] "RemoveContainer" containerID="ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.913598 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rrb9z"] Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.923285 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rrb9z"] Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.938938 4867 scope.go:117] "RemoveContainer" containerID="f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.978542 4867 scope.go:117] "RemoveContainer" containerID="7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec" Dec 01 10:27:06 crc kubenswrapper[4867]: E1201 10:27:06.979437 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec\": container with ID starting with 7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec not found: ID does not exist" containerID="7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.979477 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec"} err="failed to get container status \"7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec\": rpc error: code = NotFound desc = could not find container \"7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec\": container with ID starting with 
7db717dae75c7ecd49e0a6e4c5dfaf2c03e540079a57ff74227cba744f7798ec not found: ID does not exist" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.979502 4867 scope.go:117] "RemoveContainer" containerID="ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d" Dec 01 10:27:06 crc kubenswrapper[4867]: E1201 10:27:06.979894 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d\": container with ID starting with ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d not found: ID does not exist" containerID="ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.979923 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d"} err="failed to get container status \"ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d\": rpc error: code = NotFound desc = could not find container \"ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d\": container with ID starting with ad623ba31de3fae9c5ef633d781f438c83eb9272365816d015ffbe16ec9a987d not found: ID does not exist" Dec 01 10:27:06 crc kubenswrapper[4867]: I1201 10:27:06.979940 4867 scope.go:117] "RemoveContainer" containerID="f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9" Dec 01 10:27:06 crc kubenswrapper[4867]: E1201 10:27:06.980294 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9\": container with ID starting with f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9 not found: ID does not exist" containerID="f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9" Dec 01 10:27:06 crc 
kubenswrapper[4867]: I1201 10:27:06.980327 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9"} err="failed to get container status \"f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9\": rpc error: code = NotFound desc = could not find container \"f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9\": container with ID starting with f20aacb4675766b0e996ebc084026415ae0c950dcc91f89c1299c8d6ea2af3f9 not found: ID does not exist" Dec 01 10:27:07 crc kubenswrapper[4867]: I1201 10:27:07.892081 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lb8ng" podUID="31292a47-8a65-496c-8c64-77e7d155df0e" containerName="registry-server" containerID="cri-o://84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761" gracePeriod=2 Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.403561 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.597925 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdrq4\" (UniqueName: \"kubernetes.io/projected/31292a47-8a65-496c-8c64-77e7d155df0e-kube-api-access-zdrq4\") pod \"31292a47-8a65-496c-8c64-77e7d155df0e\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.597986 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-utilities\") pod \"31292a47-8a65-496c-8c64-77e7d155df0e\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.598087 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-catalog-content\") pod \"31292a47-8a65-496c-8c64-77e7d155df0e\" (UID: \"31292a47-8a65-496c-8c64-77e7d155df0e\") " Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.607574 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-utilities" (OuterVolumeSpecName: "utilities") pod "31292a47-8a65-496c-8c64-77e7d155df0e" (UID: "31292a47-8a65-496c-8c64-77e7d155df0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.620769 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31292a47-8a65-496c-8c64-77e7d155df0e" (UID: "31292a47-8a65-496c-8c64-77e7d155df0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.629025 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31292a47-8a65-496c-8c64-77e7d155df0e-kube-api-access-zdrq4" (OuterVolumeSpecName: "kube-api-access-zdrq4") pod "31292a47-8a65-496c-8c64-77e7d155df0e" (UID: "31292a47-8a65-496c-8c64-77e7d155df0e"). InnerVolumeSpecName "kube-api-access-zdrq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.700832 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdrq4\" (UniqueName: \"kubernetes.io/projected/31292a47-8a65-496c-8c64-77e7d155df0e-kube-api-access-zdrq4\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.700874 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.700884 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31292a47-8a65-496c-8c64-77e7d155df0e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.844310 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" path="/var/lib/kubelet/pods/35a899ec-abcc-42c5-9f2a-61a9e4f0c801/volumes" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.903125 4867 generic.go:334] "Generic (PLEG): container finished" podID="31292a47-8a65-496c-8c64-77e7d155df0e" containerID="84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761" exitCode=0 Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.903190 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lb8ng" event={"ID":"31292a47-8a65-496c-8c64-77e7d155df0e","Type":"ContainerDied","Data":"84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761"} Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.903224 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb8ng" event={"ID":"31292a47-8a65-496c-8c64-77e7d155df0e","Type":"ContainerDied","Data":"57ffbd666c4b6399df0a8b6f27fc1175c12f7ea9b25a5b2ba0406a9529615c84"} Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.903246 4867 scope.go:117] "RemoveContainer" containerID="84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.903194 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb8ng" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.930850 4867 scope.go:117] "RemoveContainer" containerID="fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5" Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.943173 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb8ng"] Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.972135 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb8ng"] Dec 01 10:27:08 crc kubenswrapper[4867]: I1201 10:27:08.973740 4867 scope.go:117] "RemoveContainer" containerID="69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3" Dec 01 10:27:09 crc kubenswrapper[4867]: I1201 10:27:09.021030 4867 scope.go:117] "RemoveContainer" containerID="84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761" Dec 01 10:27:09 crc kubenswrapper[4867]: E1201 10:27:09.021552 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761\": container with ID starting with 84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761 not found: ID does not exist" containerID="84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761" Dec 01 10:27:09 crc kubenswrapper[4867]: I1201 10:27:09.021590 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761"} err="failed to get container status \"84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761\": rpc error: code = NotFound desc = could not find container \"84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761\": container with ID starting with 84460ed8426b95c5818604ac6849a520de8ae1f6b81f1b5cbcdd2f7019dfb761 not found: ID does not exist" Dec 01 10:27:09 crc kubenswrapper[4867]: I1201 10:27:09.021619 4867 scope.go:117] "RemoveContainer" containerID="fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5" Dec 01 10:27:09 crc kubenswrapper[4867]: E1201 10:27:09.021987 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5\": container with ID starting with fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5 not found: ID does not exist" containerID="fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5" Dec 01 10:27:09 crc kubenswrapper[4867]: I1201 10:27:09.022123 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5"} err="failed to get container status \"fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5\": rpc error: code = NotFound desc = could not find container \"fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5\": container with ID 
starting with fcbea4f2389b544f7f5ce1764475971e17b37097d3ef4e57196dd79a342f9af5 not found: ID does not exist" Dec 01 10:27:09 crc kubenswrapper[4867]: I1201 10:27:09.022217 4867 scope.go:117] "RemoveContainer" containerID="69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3" Dec 01 10:27:09 crc kubenswrapper[4867]: E1201 10:27:09.022528 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3\": container with ID starting with 69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3 not found: ID does not exist" containerID="69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3" Dec 01 10:27:09 crc kubenswrapper[4867]: I1201 10:27:09.022604 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3"} err="failed to get container status \"69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3\": rpc error: code = NotFound desc = could not find container \"69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3\": container with ID starting with 69e35278eaf7605464fd0f39de205076cc6afbb852e3c748b522e3ffb7f814c3 not found: ID does not exist" Dec 01 10:27:10 crc kubenswrapper[4867]: I1201 10:27:10.839825 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31292a47-8a65-496c-8c64-77e7d155df0e" path="/var/lib/kubelet/pods/31292a47-8a65-496c-8c64-77e7d155df0e/volumes" Dec 01 10:27:21 crc kubenswrapper[4867]: I1201 10:27:21.601924 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:27:21 crc kubenswrapper[4867]: I1201 
10:27:21.602502 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:27:22 crc kubenswrapper[4867]: I1201 10:27:22.558034 4867 scope.go:117] "RemoveContainer" containerID="574fddfecc0241f75a058f7ce1493cdc252b22222afc3fdface89f94cedc8d4b" Dec 01 10:27:22 crc kubenswrapper[4867]: I1201 10:27:22.583996 4867 scope.go:117] "RemoveContainer" containerID="59267a3447189b12e345840903dae805d568068bd132c36932df4974776bf640" Dec 01 10:27:22 crc kubenswrapper[4867]: I1201 10:27:22.620336 4867 scope.go:117] "RemoveContainer" containerID="df02eda5a60dd33ac4daaed9ca5c42f7794ecfff47c23e73e6529b7e271f3aa0" Dec 01 10:27:33 crc kubenswrapper[4867]: I1201 10:27:33.112624 4867 generic.go:334] "Generic (PLEG): container finished" podID="31b3d747-c383-483d-8919-be1dd3a266b6" containerID="f948a94cd1b99d4df29f081e26ec442a368417e18fba05cb3ed8073b8018f431" exitCode=0 Dec 01 10:27:33 crc kubenswrapper[4867]: I1201 10:27:33.112726 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"31b3d747-c383-483d-8919-be1dd3a266b6","Type":"ContainerDied","Data":"f948a94cd1b99d4df29f081e26ec442a368417e18fba05cb3ed8073b8018f431"} Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.536245 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.731449 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ssh-key\") pod \"31b3d747-c383-483d-8919-be1dd3a266b6\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.731492 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"31b3d747-c383-483d-8919-be1dd3a266b6\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.731549 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ca-certs\") pod \"31b3d747-c383-483d-8919-be1dd3a266b6\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.731602 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-temporary\") pod \"31b3d747-c383-483d-8919-be1dd3a266b6\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.731634 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config\") pod \"31b3d747-c383-483d-8919-be1dd3a266b6\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.731660 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt9vj\" 
(UniqueName: \"kubernetes.io/projected/31b3d747-c383-483d-8919-be1dd3a266b6-kube-api-access-xt9vj\") pod \"31b3d747-c383-483d-8919-be1dd3a266b6\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.731687 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-workdir\") pod \"31b3d747-c383-483d-8919-be1dd3a266b6\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.731898 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config-secret\") pod \"31b3d747-c383-483d-8919-be1dd3a266b6\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.731920 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-config-data\") pod \"31b3d747-c383-483d-8919-be1dd3a266b6\" (UID: \"31b3d747-c383-483d-8919-be1dd3a266b6\") " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.741598 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "31b3d747-c383-483d-8919-be1dd3a266b6" (UID: "31b3d747-c383-483d-8919-be1dd3a266b6"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.742587 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "31b3d747-c383-483d-8919-be1dd3a266b6" (UID: "31b3d747-c383-483d-8919-be1dd3a266b6"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.743742 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-config-data" (OuterVolumeSpecName: "config-data") pod "31b3d747-c383-483d-8919-be1dd3a266b6" (UID: "31b3d747-c383-483d-8919-be1dd3a266b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.772888 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "31b3d747-c383-483d-8919-be1dd3a266b6" (UID: "31b3d747-c383-483d-8919-be1dd3a266b6"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.783852 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "31b3d747-c383-483d-8919-be1dd3a266b6" (UID: "31b3d747-c383-483d-8919-be1dd3a266b6"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.798431 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31b3d747-c383-483d-8919-be1dd3a266b6-kube-api-access-xt9vj" (OuterVolumeSpecName: "kube-api-access-xt9vj") pod "31b3d747-c383-483d-8919-be1dd3a266b6" (UID: "31b3d747-c383-483d-8919-be1dd3a266b6"). InnerVolumeSpecName "kube-api-access-xt9vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.807014 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "31b3d747-c383-483d-8919-be1dd3a266b6" (UID: "31b3d747-c383-483d-8919-be1dd3a266b6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.831114 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "31b3d747-c383-483d-8919-be1dd3a266b6" (UID: "31b3d747-c383-483d-8919-be1dd3a266b6"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.836341 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.836369 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.836383 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.837363 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.837380 4867 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/31b3d747-c383-483d-8919-be1dd3a266b6-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.837394 4867 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.837408 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt9vj\" (UniqueName: \"kubernetes.io/projected/31b3d747-c383-483d-8919-be1dd3a266b6-kube-api-access-xt9vj\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 
10:27:34.837420 4867 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/31b3d747-c383-483d-8919-be1dd3a266b6-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.858471 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "31b3d747-c383-483d-8919-be1dd3a266b6" (UID: "31b3d747-c383-483d-8919-be1dd3a266b6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.876387 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.941351 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:34 crc kubenswrapper[4867]: I1201 10:27:34.941394 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31b3d747-c383-483d-8919-be1dd3a266b6-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:35 crc kubenswrapper[4867]: I1201 10:27:35.134589 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"31b3d747-c383-483d-8919-be1dd3a266b6","Type":"ContainerDied","Data":"bb545976072e04bbb374e034523593a4164c975cbe7afcf61c472ba7282403d7"} Dec 01 10:27:35 crc kubenswrapper[4867]: I1201 10:27:35.134634 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 10:27:35 crc kubenswrapper[4867]: I1201 10:27:35.134639 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb545976072e04bbb374e034523593a4164c975cbe7afcf61c472ba7282403d7" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.551944 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 10:27:38 crc kubenswrapper[4867]: E1201 10:27:38.552600 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerName="extract-utilities" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.552614 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerName="extract-utilities" Dec 01 10:27:38 crc kubenswrapper[4867]: E1201 10:27:38.552638 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b3d747-c383-483d-8919-be1dd3a266b6" containerName="tempest-tests-tempest-tests-runner" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.552645 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b3d747-c383-483d-8919-be1dd3a266b6" containerName="tempest-tests-tempest-tests-runner" Dec 01 10:27:38 crc kubenswrapper[4867]: E1201 10:27:38.552653 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31292a47-8a65-496c-8c64-77e7d155df0e" containerName="extract-content" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.552659 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="31292a47-8a65-496c-8c64-77e7d155df0e" containerName="extract-content" Dec 01 10:27:38 crc kubenswrapper[4867]: E1201 10:27:38.552671 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31292a47-8a65-496c-8c64-77e7d155df0e" containerName="extract-utilities" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.552677 4867 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="31292a47-8a65-496c-8c64-77e7d155df0e" containerName="extract-utilities" Dec 01 10:27:38 crc kubenswrapper[4867]: E1201 10:27:38.552689 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31292a47-8a65-496c-8c64-77e7d155df0e" containerName="registry-server" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.552695 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="31292a47-8a65-496c-8c64-77e7d155df0e" containerName="registry-server" Dec 01 10:27:38 crc kubenswrapper[4867]: E1201 10:27:38.552722 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerName="extract-content" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.552729 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerName="extract-content" Dec 01 10:27:38 crc kubenswrapper[4867]: E1201 10:27:38.552744 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerName="registry-server" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.552750 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerName="registry-server" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.554997 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="31b3d747-c383-483d-8919-be1dd3a266b6" containerName="tempest-tests-tempest-tests-runner" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.555048 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a899ec-abcc-42c5-9f2a-61a9e4f0c801" containerName="registry-server" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.555060 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="31292a47-8a65-496c-8c64-77e7d155df0e" containerName="registry-server" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.555904 4867 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.559528 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-77hwz" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.571494 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.619648 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pktm8\" (UniqueName: \"kubernetes.io/projected/640e57ba-3d94-41be-bcd7-0c5eeff8a092-kube-api-access-pktm8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"640e57ba-3d94-41be-bcd7-0c5eeff8a092\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.620038 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"640e57ba-3d94-41be-bcd7-0c5eeff8a092\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.722686 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pktm8\" (UniqueName: \"kubernetes.io/projected/640e57ba-3d94-41be-bcd7-0c5eeff8a092-kube-api-access-pktm8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"640e57ba-3d94-41be-bcd7-0c5eeff8a092\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.722730 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"640e57ba-3d94-41be-bcd7-0c5eeff8a092\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.723598 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"640e57ba-3d94-41be-bcd7-0c5eeff8a092\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.742836 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pktm8\" (UniqueName: \"kubernetes.io/projected/640e57ba-3d94-41be-bcd7-0c5eeff8a092-kube-api-access-pktm8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"640e57ba-3d94-41be-bcd7-0c5eeff8a092\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.749060 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"640e57ba-3d94-41be-bcd7-0c5eeff8a092\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:38 crc kubenswrapper[4867]: I1201 10:27:38.878152 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:39 crc kubenswrapper[4867]: I1201 10:27:39.365377 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 10:27:40 crc kubenswrapper[4867]: I1201 10:27:40.184214 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"640e57ba-3d94-41be-bcd7-0c5eeff8a092","Type":"ContainerStarted","Data":"70e91033418a3a6df58b363ab49fe41236406f7967537448fd74003055f18649"} Dec 01 10:27:44 crc kubenswrapper[4867]: I1201 10:27:44.219670 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"640e57ba-3d94-41be-bcd7-0c5eeff8a092","Type":"ContainerStarted","Data":"a7c12b49ca128e3768fb3738abb9e8100752011aea62d9bbb309bde469caf7f5"} Dec 01 10:27:51 crc kubenswrapper[4867]: I1201 10:27:51.601845 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:27:51 crc kubenswrapper[4867]: I1201 10:27:51.602893 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.672893 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=27.26317944 podStartE2EDuration="31.672872594s" 
podCreationTimestamp="2025-12-01 10:27:38 +0000 UTC" firstStartedPulling="2025-12-01 10:27:39.378985669 +0000 UTC m=+4780.838372423" lastFinishedPulling="2025-12-01 10:27:43.788678823 +0000 UTC m=+4785.248065577" observedRunningTime="2025-12-01 10:27:44.241473701 +0000 UTC m=+4785.700860455" watchObservedRunningTime="2025-12-01 10:28:09.672872594 +0000 UTC m=+4811.132259338" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.677369 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j9btd/must-gather-cgj5v"] Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.679086 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.680782 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-j9btd"/"default-dockercfg-69nvw" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.681198 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j9btd"/"kube-root-ca.crt" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.682005 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j9btd"/"openshift-service-ca.crt" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.693040 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j9btd/must-gather-cgj5v"] Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.717137 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-must-gather-output\") pod \"must-gather-cgj5v\" (UID: \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\") " pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.717225 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p4ds\" (UniqueName: \"kubernetes.io/projected/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-kube-api-access-5p4ds\") pod \"must-gather-cgj5v\" (UID: \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\") " pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.819076 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-must-gather-output\") pod \"must-gather-cgj5v\" (UID: \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\") " pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.819707 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-must-gather-output\") pod \"must-gather-cgj5v\" (UID: \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\") " pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.819160 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p4ds\" (UniqueName: \"kubernetes.io/projected/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-kube-api-access-5p4ds\") pod \"must-gather-cgj5v\" (UID: \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\") " pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.842197 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p4ds\" (UniqueName: \"kubernetes.io/projected/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-kube-api-access-5p4ds\") pod \"must-gather-cgj5v\" (UID: \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\") " pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:28:09 crc kubenswrapper[4867]: I1201 10:28:09.997349 4867 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:28:11 crc kubenswrapper[4867]: I1201 10:28:11.228698 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j9btd/must-gather-cgj5v"] Dec 01 10:28:11 crc kubenswrapper[4867]: I1201 10:28:11.486766 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/must-gather-cgj5v" event={"ID":"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0","Type":"ContainerStarted","Data":"09646ed519acab12658414f70a45aa1f28bb62e5deedc475424994531df01356"} Dec 01 10:28:17 crc kubenswrapper[4867]: I1201 10:28:17.601516 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/must-gather-cgj5v" event={"ID":"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0","Type":"ContainerStarted","Data":"4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9"} Dec 01 10:28:17 crc kubenswrapper[4867]: I1201 10:28:17.602183 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/must-gather-cgj5v" event={"ID":"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0","Type":"ContainerStarted","Data":"e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f"} Dec 01 10:28:17 crc kubenswrapper[4867]: I1201 10:28:17.626437 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j9btd/must-gather-cgj5v" podStartSLOduration=3.4804652640000002 podStartE2EDuration="8.626421014s" podCreationTimestamp="2025-12-01 10:28:09 +0000 UTC" firstStartedPulling="2025-12-01 10:28:11.236398841 +0000 UTC m=+4812.695785595" lastFinishedPulling="2025-12-01 10:28:16.382354591 +0000 UTC m=+4817.841741345" observedRunningTime="2025-12-01 10:28:17.621950322 +0000 UTC m=+4819.081337076" watchObservedRunningTime="2025-12-01 10:28:17.626421014 +0000 UTC m=+4819.085807768" Dec 01 10:28:21 crc kubenswrapper[4867]: I1201 10:28:21.601476 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:28:21 crc kubenswrapper[4867]: I1201 10:28:21.602077 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:28:21 crc kubenswrapper[4867]: I1201 10:28:21.602121 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 10:28:21 crc kubenswrapper[4867]: I1201 10:28:21.602878 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5239f7b77b22875fa81a2f4170c3ec389ab40f1ee4ab2299324fc1ac724ac7b"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:28:21 crc kubenswrapper[4867]: I1201 10:28:21.602935 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://d5239f7b77b22875fa81a2f4170c3ec389ab40f1ee4ab2299324fc1ac724ac7b" gracePeriod=600 Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.223972 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j9btd/crc-debug-4bw2s"] Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.225892 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.282803 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4fw4\" (UniqueName: \"kubernetes.io/projected/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-kube-api-access-h4fw4\") pod \"crc-debug-4bw2s\" (UID: \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\") " pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.282895 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-host\") pod \"crc-debug-4bw2s\" (UID: \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\") " pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.384673 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4fw4\" (UniqueName: \"kubernetes.io/projected/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-kube-api-access-h4fw4\") pod \"crc-debug-4bw2s\" (UID: \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\") " pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.384750 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-host\") pod \"crc-debug-4bw2s\" (UID: \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\") " pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.384834 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-host\") pod \"crc-debug-4bw2s\" (UID: \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\") " pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:28:22 crc 
kubenswrapper[4867]: I1201 10:28:22.433796 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4fw4\" (UniqueName: \"kubernetes.io/projected/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-kube-api-access-h4fw4\") pod \"crc-debug-4bw2s\" (UID: \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\") " pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.544112 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:28:22 crc kubenswrapper[4867]: W1201 10:28:22.592379 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbccf37bf_5f4e_4d5a_b610_3643bd2a58cf.slice/crio-1a8e9277d94da6155dc1c826d11ef2816307405e9fd9abab65f66152418051b0 WatchSource:0}: Error finding container 1a8e9277d94da6155dc1c826d11ef2816307405e9fd9abab65f66152418051b0: Status 404 returned error can't find the container with id 1a8e9277d94da6155dc1c826d11ef2816307405e9fd9abab65f66152418051b0 Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.649354 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/crc-debug-4bw2s" event={"ID":"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf","Type":"ContainerStarted","Data":"1a8e9277d94da6155dc1c826d11ef2816307405e9fd9abab65f66152418051b0"} Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.660035 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="d5239f7b77b22875fa81a2f4170c3ec389ab40f1ee4ab2299324fc1ac724ac7b" exitCode=0 Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.660099 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" 
event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"d5239f7b77b22875fa81a2f4170c3ec389ab40f1ee4ab2299324fc1ac724ac7b"} Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.660142 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb"} Dec 01 10:28:22 crc kubenswrapper[4867]: I1201 10:28:22.660265 4867 scope.go:117] "RemoveContainer" containerID="9eb9d1c7ad1170a2d03f126376950d982950d3a94ec8f85fd8ed46c3c555336b" Dec 01 10:28:35 crc kubenswrapper[4867]: I1201 10:28:35.818922 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/crc-debug-4bw2s" event={"ID":"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf","Type":"ContainerStarted","Data":"4fef6964d1a2d1bd1ffe2cd0ad922d180f3edd03365b2588b338ca8c1fb7dffd"} Dec 01 10:28:35 crc kubenswrapper[4867]: I1201 10:28:35.859109 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j9btd/crc-debug-4bw2s" podStartSLOduration=1.852296719 podStartE2EDuration="13.859089324s" podCreationTimestamp="2025-12-01 10:28:22 +0000 UTC" firstStartedPulling="2025-12-01 10:28:22.595559399 +0000 UTC m=+4824.054946153" lastFinishedPulling="2025-12-01 10:28:34.602351994 +0000 UTC m=+4836.061738758" observedRunningTime="2025-12-01 10:28:35.838576031 +0000 UTC m=+4837.297962785" watchObservedRunningTime="2025-12-01 10:28:35.859089324 +0000 UTC m=+4837.318476078" Dec 01 10:29:25 crc kubenswrapper[4867]: I1201 10:29:25.264337 4867 generic.go:334] "Generic (PLEG): container finished" podID="bccf37bf-5f4e-4d5a-b610-3643bd2a58cf" containerID="4fef6964d1a2d1bd1ffe2cd0ad922d180f3edd03365b2588b338ca8c1fb7dffd" exitCode=0 Dec 01 10:29:25 crc kubenswrapper[4867]: I1201 10:29:25.264420 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-j9btd/crc-debug-4bw2s" event={"ID":"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf","Type":"ContainerDied","Data":"4fef6964d1a2d1bd1ffe2cd0ad922d180f3edd03365b2588b338ca8c1fb7dffd"} Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.422259 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.464186 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j9btd/crc-debug-4bw2s"] Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.474010 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j9btd/crc-debug-4bw2s"] Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.570422 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-host\") pod \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\" (UID: \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\") " Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.570801 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4fw4\" (UniqueName: \"kubernetes.io/projected/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-kube-api-access-h4fw4\") pod \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\" (UID: \"bccf37bf-5f4e-4d5a-b610-3643bd2a58cf\") " Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.570559 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-host" (OuterVolumeSpecName: "host") pod "bccf37bf-5f4e-4d5a-b610-3643bd2a58cf" (UID: "bccf37bf-5f4e-4d5a-b610-3643bd2a58cf"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.571389 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.586057 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-kube-api-access-h4fw4" (OuterVolumeSpecName: "kube-api-access-h4fw4") pod "bccf37bf-5f4e-4d5a-b610-3643bd2a58cf" (UID: "bccf37bf-5f4e-4d5a-b610-3643bd2a58cf"). InnerVolumeSpecName "kube-api-access-h4fw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.673779 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4fw4\" (UniqueName: \"kubernetes.io/projected/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf-kube-api-access-h4fw4\") on node \"crc\" DevicePath \"\"" Dec 01 10:29:26 crc kubenswrapper[4867]: I1201 10:29:26.837978 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bccf37bf-5f4e-4d5a-b610-3643bd2a58cf" path="/var/lib/kubelet/pods/bccf37bf-5f4e-4d5a-b610-3643bd2a58cf/volumes" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.284391 4867 scope.go:117] "RemoveContainer" containerID="4fef6964d1a2d1bd1ffe2cd0ad922d180f3edd03365b2588b338ca8c1fb7dffd" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.284433 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-4bw2s" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.654398 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j9btd/crc-debug-dxv29"] Dec 01 10:29:27 crc kubenswrapper[4867]: E1201 10:29:27.655726 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccf37bf-5f4e-4d5a-b610-3643bd2a58cf" containerName="container-00" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.655801 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccf37bf-5f4e-4d5a-b610-3643bd2a58cf" containerName="container-00" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.656104 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccf37bf-5f4e-4d5a-b610-3643bd2a58cf" containerName="container-00" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.657269 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.691123 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f5f59af-6612-4699-b509-51f26f3c1274-host\") pod \"crc-debug-dxv29\" (UID: \"5f5f59af-6612-4699-b509-51f26f3c1274\") " pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.691281 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdvgw\" (UniqueName: \"kubernetes.io/projected/5f5f59af-6612-4699-b509-51f26f3c1274-kube-api-access-pdvgw\") pod \"crc-debug-dxv29\" (UID: \"5f5f59af-6612-4699-b509-51f26f3c1274\") " pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.792617 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5f5f59af-6612-4699-b509-51f26f3c1274-host\") pod \"crc-debug-dxv29\" (UID: \"5f5f59af-6612-4699-b509-51f26f3c1274\") " pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.792694 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdvgw\" (UniqueName: \"kubernetes.io/projected/5f5f59af-6612-4699-b509-51f26f3c1274-kube-api-access-pdvgw\") pod \"crc-debug-dxv29\" (UID: \"5f5f59af-6612-4699-b509-51f26f3c1274\") " pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.792788 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f5f59af-6612-4699-b509-51f26f3c1274-host\") pod \"crc-debug-dxv29\" (UID: \"5f5f59af-6612-4699-b509-51f26f3c1274\") " pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.824999 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdvgw\" (UniqueName: \"kubernetes.io/projected/5f5f59af-6612-4699-b509-51f26f3c1274-kube-api-access-pdvgw\") pod \"crc-debug-dxv29\" (UID: \"5f5f59af-6612-4699-b509-51f26f3c1274\") " pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:27 crc kubenswrapper[4867]: I1201 10:29:27.974921 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:28 crc kubenswrapper[4867]: I1201 10:29:28.294804 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/crc-debug-dxv29" event={"ID":"5f5f59af-6612-4699-b509-51f26f3c1274","Type":"ContainerStarted","Data":"bbeb71490ef30a7f3298901d003dca10d62fd1e19ef7d09c8f154e82c2fa643f"} Dec 01 10:29:28 crc kubenswrapper[4867]: I1201 10:29:28.295203 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/crc-debug-dxv29" event={"ID":"5f5f59af-6612-4699-b509-51f26f3c1274","Type":"ContainerStarted","Data":"f1c1e6f2ed3dd57e422d7f1b5365da3787fa80f4bc90ab0e774179d1aba08ed6"} Dec 01 10:29:28 crc kubenswrapper[4867]: I1201 10:29:28.313564 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j9btd/crc-debug-dxv29" podStartSLOduration=1.313519802 podStartE2EDuration="1.313519802s" podCreationTimestamp="2025-12-01 10:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:29:28.309380819 +0000 UTC m=+4889.768767573" watchObservedRunningTime="2025-12-01 10:29:28.313519802 +0000 UTC m=+4889.772906566" Dec 01 10:29:29 crc kubenswrapper[4867]: I1201 10:29:29.316536 4867 generic.go:334] "Generic (PLEG): container finished" podID="5f5f59af-6612-4699-b509-51f26f3c1274" containerID="bbeb71490ef30a7f3298901d003dca10d62fd1e19ef7d09c8f154e82c2fa643f" exitCode=0 Dec 01 10:29:29 crc kubenswrapper[4867]: I1201 10:29:29.316891 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/crc-debug-dxv29" event={"ID":"5f5f59af-6612-4699-b509-51f26f3c1274","Type":"ContainerDied","Data":"bbeb71490ef30a7f3298901d003dca10d62fd1e19ef7d09c8f154e82c2fa643f"} Dec 01 10:29:30 crc kubenswrapper[4867]: I1201 10:29:30.421508 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:30 crc kubenswrapper[4867]: I1201 10:29:30.443609 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f5f59af-6612-4699-b509-51f26f3c1274-host\") pod \"5f5f59af-6612-4699-b509-51f26f3c1274\" (UID: \"5f5f59af-6612-4699-b509-51f26f3c1274\") " Dec 01 10:29:30 crc kubenswrapper[4867]: I1201 10:29:30.443929 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f5f59af-6612-4699-b509-51f26f3c1274-host" (OuterVolumeSpecName: "host") pod "5f5f59af-6612-4699-b509-51f26f3c1274" (UID: "5f5f59af-6612-4699-b509-51f26f3c1274"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:29:30 crc kubenswrapper[4867]: I1201 10:29:30.444352 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdvgw\" (UniqueName: \"kubernetes.io/projected/5f5f59af-6612-4699-b509-51f26f3c1274-kube-api-access-pdvgw\") pod \"5f5f59af-6612-4699-b509-51f26f3c1274\" (UID: \"5f5f59af-6612-4699-b509-51f26f3c1274\") " Dec 01 10:29:30 crc kubenswrapper[4867]: I1201 10:29:30.445060 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f5f59af-6612-4699-b509-51f26f3c1274-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:29:30 crc kubenswrapper[4867]: I1201 10:29:30.451285 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5f59af-6612-4699-b509-51f26f3c1274-kube-api-access-pdvgw" (OuterVolumeSpecName: "kube-api-access-pdvgw") pod "5f5f59af-6612-4699-b509-51f26f3c1274" (UID: "5f5f59af-6612-4699-b509-51f26f3c1274"). InnerVolumeSpecName "kube-api-access-pdvgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:29:30 crc kubenswrapper[4867]: I1201 10:29:30.546142 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdvgw\" (UniqueName: \"kubernetes.io/projected/5f5f59af-6612-4699-b509-51f26f3c1274-kube-api-access-pdvgw\") on node \"crc\" DevicePath \"\"" Dec 01 10:29:30 crc kubenswrapper[4867]: I1201 10:29:30.859326 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j9btd/crc-debug-dxv29"] Dec 01 10:29:30 crc kubenswrapper[4867]: I1201 10:29:30.862056 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j9btd/crc-debug-dxv29"] Dec 01 10:29:31 crc kubenswrapper[4867]: I1201 10:29:31.334820 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1c1e6f2ed3dd57e422d7f1b5365da3787fa80f4bc90ab0e774179d1aba08ed6" Dec 01 10:29:31 crc kubenswrapper[4867]: I1201 10:29:31.334879 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-dxv29" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.047283 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j9btd/crc-debug-6fc6j"] Dec 01 10:29:32 crc kubenswrapper[4867]: E1201 10:29:32.047933 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f5f59af-6612-4699-b509-51f26f3c1274" containerName="container-00" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.047948 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5f59af-6612-4699-b509-51f26f3c1274" containerName="container-00" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.048152 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f5f59af-6612-4699-b509-51f26f3c1274" containerName="container-00" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.048729 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.175359 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-host\") pod \"crc-debug-6fc6j\" (UID: \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\") " pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.175407 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm5mt\" (UniqueName: \"kubernetes.io/projected/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-kube-api-access-vm5mt\") pod \"crc-debug-6fc6j\" (UID: \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\") " pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.278119 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-host\") pod \"crc-debug-6fc6j\" (UID: \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\") " pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.278160 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm5mt\" (UniqueName: \"kubernetes.io/projected/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-kube-api-access-vm5mt\") pod \"crc-debug-6fc6j\" (UID: \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\") " pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.278402 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-host\") pod \"crc-debug-6fc6j\" (UID: \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\") " pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:32 crc 
kubenswrapper[4867]: I1201 10:29:32.298045 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm5mt\" (UniqueName: \"kubernetes.io/projected/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-kube-api-access-vm5mt\") pod \"crc-debug-6fc6j\" (UID: \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\") " pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.364589 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:32 crc kubenswrapper[4867]: W1201 10:29:32.391251 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65bd8f93_f2ed_4fe7_bd38_77a38821c8dc.slice/crio-c18c8e5a3085fc9ed63e5adf73c2cbe23339669fc5771df9b21a3901e95bad9b WatchSource:0}: Error finding container c18c8e5a3085fc9ed63e5adf73c2cbe23339669fc5771df9b21a3901e95bad9b: Status 404 returned error can't find the container with id c18c8e5a3085fc9ed63e5adf73c2cbe23339669fc5771df9b21a3901e95bad9b Dec 01 10:29:32 crc kubenswrapper[4867]: I1201 10:29:32.837485 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f5f59af-6612-4699-b509-51f26f3c1274" path="/var/lib/kubelet/pods/5f5f59af-6612-4699-b509-51f26f3c1274/volumes" Dec 01 10:29:33 crc kubenswrapper[4867]: I1201 10:29:33.351239 4867 generic.go:334] "Generic (PLEG): container finished" podID="65bd8f93-f2ed-4fe7-bd38-77a38821c8dc" containerID="9963ab33dae02eeef82b32f105a2d3aa8290a9359ec99469f220db806b825e24" exitCode=0 Dec 01 10:29:33 crc kubenswrapper[4867]: I1201 10:29:33.351417 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/crc-debug-6fc6j" event={"ID":"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc","Type":"ContainerDied","Data":"9963ab33dae02eeef82b32f105a2d3aa8290a9359ec99469f220db806b825e24"} Dec 01 10:29:33 crc kubenswrapper[4867]: I1201 10:29:33.351508 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/crc-debug-6fc6j" event={"ID":"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc","Type":"ContainerStarted","Data":"c18c8e5a3085fc9ed63e5adf73c2cbe23339669fc5771df9b21a3901e95bad9b"} Dec 01 10:29:33 crc kubenswrapper[4867]: I1201 10:29:33.383484 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j9btd/crc-debug-6fc6j"] Dec 01 10:29:33 crc kubenswrapper[4867]: I1201 10:29:33.391472 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j9btd/crc-debug-6fc6j"] Dec 01 10:29:34 crc kubenswrapper[4867]: I1201 10:29:34.722884 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:34 crc kubenswrapper[4867]: I1201 10:29:34.734331 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-host\") pod \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\" (UID: \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\") " Dec 01 10:29:34 crc kubenswrapper[4867]: I1201 10:29:34.734477 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-host" (OuterVolumeSpecName: "host") pod "65bd8f93-f2ed-4fe7-bd38-77a38821c8dc" (UID: "65bd8f93-f2ed-4fe7-bd38-77a38821c8dc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:29:34 crc kubenswrapper[4867]: I1201 10:29:34.734786 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm5mt\" (UniqueName: \"kubernetes.io/projected/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-kube-api-access-vm5mt\") pod \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\" (UID: \"65bd8f93-f2ed-4fe7-bd38-77a38821c8dc\") " Dec 01 10:29:34 crc kubenswrapper[4867]: I1201 10:29:34.735465 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:29:34 crc kubenswrapper[4867]: I1201 10:29:34.741236 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-kube-api-access-vm5mt" (OuterVolumeSpecName: "kube-api-access-vm5mt") pod "65bd8f93-f2ed-4fe7-bd38-77a38821c8dc" (UID: "65bd8f93-f2ed-4fe7-bd38-77a38821c8dc"). InnerVolumeSpecName "kube-api-access-vm5mt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:29:34 crc kubenswrapper[4867]: I1201 10:29:34.836459 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm5mt\" (UniqueName: \"kubernetes.io/projected/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc-kube-api-access-vm5mt\") on node \"crc\" DevicePath \"\"" Dec 01 10:29:34 crc kubenswrapper[4867]: I1201 10:29:34.840237 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bd8f93-f2ed-4fe7-bd38-77a38821c8dc" path="/var/lib/kubelet/pods/65bd8f93-f2ed-4fe7-bd38-77a38821c8dc/volumes" Dec 01 10:29:35 crc kubenswrapper[4867]: I1201 10:29:35.381125 4867 scope.go:117] "RemoveContainer" containerID="9963ab33dae02eeef82b32f105a2d3aa8290a9359ec99469f220db806b825e24" Dec 01 10:29:35 crc kubenswrapper[4867]: I1201 10:29:35.381604 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j9btd/crc-debug-6fc6j" Dec 01 10:29:54 crc kubenswrapper[4867]: I1201 10:29:54.700430 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b97bc66cd-p4vv6_f948002f-f1df-40b5-8fcc-db28284c2609/barbican-api/0.log" Dec 01 10:29:54 crc kubenswrapper[4867]: I1201 10:29:54.783434 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b97bc66cd-p4vv6_f948002f-f1df-40b5-8fcc-db28284c2609/barbican-api-log/0.log" Dec 01 10:29:54 crc kubenswrapper[4867]: I1201 10:29:54.976405 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f9f6fdc98-l7cht_a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2/barbican-keystone-listener/0.log" Dec 01 10:29:55 crc kubenswrapper[4867]: I1201 10:29:55.051842 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65fbb9cf75-989xz_8469f9a0-94d4-4c2c-839a-80d619a2d984/barbican-worker/0.log" Dec 01 10:29:55 crc kubenswrapper[4867]: I1201 10:29:55.080711 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f9f6fdc98-l7cht_a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2/barbican-keystone-listener-log/0.log" Dec 01 10:29:55 crc kubenswrapper[4867]: I1201 10:29:55.298832 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65fbb9cf75-989xz_8469f9a0-94d4-4c2c-839a-80d619a2d984/barbican-worker-log/0.log" Dec 01 10:29:55 crc kubenswrapper[4867]: I1201 10:29:55.313109 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw_a32a973f-6473-444b-a71a-d848773d8de2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:55 crc kubenswrapper[4867]: I1201 10:29:55.603946 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2699b818-66ce-4531-9084-e599305630ed/proxy-httpd/0.log" Dec 01 10:29:55 
crc kubenswrapper[4867]: I1201 10:29:55.646966 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2699b818-66ce-4531-9084-e599305630ed/sg-core/0.log" Dec 01 10:29:55 crc kubenswrapper[4867]: I1201 10:29:55.683380 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2699b818-66ce-4531-9084-e599305630ed/ceilometer-notification-agent/0.log" Dec 01 10:29:55 crc kubenswrapper[4867]: I1201 10:29:55.689719 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2699b818-66ce-4531-9084-e599305630ed/ceilometer-central-agent/0.log" Dec 01 10:29:55 crc kubenswrapper[4867]: I1201 10:29:55.954640 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c71e5b77-e090-4fdd-a254-387c5f9c5fba/cinder-api-log/0.log" Dec 01 10:29:56 crc kubenswrapper[4867]: I1201 10:29:56.071382 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c71e5b77-e090-4fdd-a254-387c5f9c5fba/cinder-api/0.log" Dec 01 10:29:56 crc kubenswrapper[4867]: I1201 10:29:56.294206 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9fe6c397-9427-4440-9d14-b0397c62f8ea/cinder-scheduler/0.log" Dec 01 10:29:56 crc kubenswrapper[4867]: I1201 10:29:56.508878 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9fe6c397-9427-4440-9d14-b0397c62f8ea/probe/0.log" Dec 01 10:29:56 crc kubenswrapper[4867]: I1201 10:29:56.560702 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-44ppb_d598d0dc-37e0-47ac-8fcd-597f70f1300b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:56 crc kubenswrapper[4867]: I1201 10:29:56.733271 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq_828b404a-aff1-4642-8893-d0ba513e520d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:56 crc kubenswrapper[4867]: I1201 10:29:56.846305 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-94nfw_cfe6379b-a971-4c4b-9cba-75f2f56de0b1/init/0.log" Dec 01 10:29:57 crc kubenswrapper[4867]: I1201 10:29:57.108788 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws_eb0d277f-4c89-46d6-8e05-e72c291e30cc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:57 crc kubenswrapper[4867]: I1201 10:29:57.181601 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-94nfw_cfe6379b-a971-4c4b-9cba-75f2f56de0b1/init/0.log" Dec 01 10:29:57 crc kubenswrapper[4867]: I1201 10:29:57.226770 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-94nfw_cfe6379b-a971-4c4b-9cba-75f2f56de0b1/dnsmasq-dns/0.log" Dec 01 10:29:57 crc kubenswrapper[4867]: I1201 10:29:57.390145 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1b458da1-78cd-4603-936e-e60b83594fad/glance-httpd/0.log" Dec 01 10:29:57 crc kubenswrapper[4867]: I1201 10:29:57.451745 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1b458da1-78cd-4603-936e-e60b83594fad/glance-log/0.log" Dec 01 10:29:57 crc kubenswrapper[4867]: I1201 10:29:57.664313 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f529607-d9e3-4605-8428-5903a9bab379/glance-log/0.log" Dec 01 10:29:57 crc kubenswrapper[4867]: I1201 10:29:57.672531 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_5f529607-d9e3-4605-8428-5903a9bab379/glance-httpd/0.log" Dec 01 10:29:57 crc kubenswrapper[4867]: I1201 10:29:57.943293 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d47c7cb76-srf4p_8bd4fac2-df2c-4aab-bf00-99b54a83ddca/horizon/2.log" Dec 01 10:29:58 crc kubenswrapper[4867]: I1201 10:29:58.081497 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d47c7cb76-srf4p_8bd4fac2-df2c-4aab-bf00-99b54a83ddca/horizon/1.log" Dec 01 10:29:58 crc kubenswrapper[4867]: I1201 10:29:58.398712 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d47c7cb76-srf4p_8bd4fac2-df2c-4aab-bf00-99b54a83ddca/horizon-log/0.log" Dec 01 10:29:58 crc kubenswrapper[4867]: I1201 10:29:58.467222 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-b8r49_c4e1c416-5403-4334-bf63-019f8546a2ab/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:58 crc kubenswrapper[4867]: I1201 10:29:58.547054 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ktd88_93968ab3-45b8-4b7a-a395-8344714bb9e9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:58 crc kubenswrapper[4867]: I1201 10:29:58.709985 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29409721-k9xqv_9a2a65a7-bbfb-40ee-bfe2-f99d1173daef/keystone-cron/0.log" Dec 01 10:29:58 crc kubenswrapper[4867]: I1201 10:29:58.893937 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_3d78f955-151d-46a9-9ef3-183051c318e6/kube-state-metrics/0.log" Dec 01 10:29:59 crc kubenswrapper[4867]: I1201 10:29:59.072350 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp_3e0cef16-29f1-49cf-aee1-7c5d9963aa81/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:59 crc kubenswrapper[4867]: I1201 10:29:59.186493 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7b56ffdd7f-kp95s_fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0/keystone-api/0.log" Dec 01 10:29:59 crc kubenswrapper[4867]: I1201 10:29:59.700438 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66_27c456e0-7f00-42b5-b4e7-c5120389d2c1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:59 crc kubenswrapper[4867]: I1201 10:29:59.815433 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59b9c878df-5k6nq_7dea6dbd-f761-4336-b755-0a2c82f6c66b/neutron-httpd/0.log" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.165736 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59b9c878df-5k6nq_7dea6dbd-f761-4336-b755-0a2c82f6c66b/neutron-api/0.log" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.166153 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw"] Dec 01 10:30:00 crc kubenswrapper[4867]: E1201 10:30:00.166719 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bd8f93-f2ed-4fe7-bd38-77a38821c8dc" containerName="container-00" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.166877 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bd8f93-f2ed-4fe7-bd38-77a38821c8dc" containerName="container-00" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.167271 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bd8f93-f2ed-4fe7-bd38-77a38821c8dc" containerName="container-00" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.168121 4867 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.179391 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.179799 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.185980 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw"] Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.219767 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c517207-fab3-441a-88f4-377be0659799-config-volume\") pod \"collect-profiles-29409750-kq6hw\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.221265 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jktms\" (UniqueName: \"kubernetes.io/projected/5c517207-fab3-441a-88f4-377be0659799-kube-api-access-jktms\") pod \"collect-profiles-29409750-kq6hw\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.225214 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c517207-fab3-441a-88f4-377be0659799-secret-volume\") pod \"collect-profiles-29409750-kq6hw\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.327168 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c517207-fab3-441a-88f4-377be0659799-config-volume\") pod \"collect-profiles-29409750-kq6hw\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.327223 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jktms\" (UniqueName: \"kubernetes.io/projected/5c517207-fab3-441a-88f4-377be0659799-kube-api-access-jktms\") pod \"collect-profiles-29409750-kq6hw\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.327315 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c517207-fab3-441a-88f4-377be0659799-secret-volume\") pod \"collect-profiles-29409750-kq6hw\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.329576 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c517207-fab3-441a-88f4-377be0659799-config-volume\") pod \"collect-profiles-29409750-kq6hw\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.349496 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jktms\" (UniqueName: 
\"kubernetes.io/projected/5c517207-fab3-441a-88f4-377be0659799-kube-api-access-jktms\") pod \"collect-profiles-29409750-kq6hw\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.374135 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c517207-fab3-441a-88f4-377be0659799-secret-volume\") pod \"collect-profiles-29409750-kq6hw\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.504133 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.928769 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3030542c-dee9-40e5-af75-53a0bbc22301/nova-cell0-conductor-conductor/0.log" Dec 01 10:30:00 crc kubenswrapper[4867]: I1201 10:30:00.976108 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_13963b70-5558-4e19-9b73-555d74be129a/nova-cell1-conductor-conductor/0.log" Dec 01 10:30:01 crc kubenswrapper[4867]: I1201 10:30:01.064930 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw"] Dec 01 10:30:01 crc kubenswrapper[4867]: I1201 10:30:01.379186 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d178f07b-43d0-48ea-a5fe-898f68e80850/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 10:30:01 crc kubenswrapper[4867]: I1201 10:30:01.547311 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29c7ce91-10c4-45b8-ba1c-db503ab7d5a7/nova-api-log/0.log" Dec 01 10:30:01 
crc kubenswrapper[4867]: I1201 10:30:01.616517 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" event={"ID":"5c517207-fab3-441a-88f4-377be0659799","Type":"ContainerStarted","Data":"62797d2afd98200b08a3ebd1b8a2e93ed3fa11270d8dd0c3eae9ea8e84b59f6d"} Dec 01 10:30:01 crc kubenswrapper[4867]: I1201 10:30:01.819315 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29c7ce91-10c4-45b8-ba1c-db503ab7d5a7/nova-api-api/0.log" Dec 01 10:30:02 crc kubenswrapper[4867]: I1201 10:30:02.345163 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2f8e90f5-24d7-406e-a2aa-d44b9e6bac71/nova-metadata-log/0.log" Dec 01 10:30:02 crc kubenswrapper[4867]: I1201 10:30:02.451157 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-lmcp6_25628db2-c71e-4e6e-bfa2-d753bfc7fb89/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:02 crc kubenswrapper[4867]: I1201 10:30:02.653145 4867 generic.go:334] "Generic (PLEG): container finished" podID="5c517207-fab3-441a-88f4-377be0659799" containerID="d9d00594f32b03277f6e32eea1d707e2898b939e8712fc8cf4fb42054fca071f" exitCode=0 Dec 01 10:30:02 crc kubenswrapper[4867]: I1201 10:30:02.653190 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" event={"ID":"5c517207-fab3-441a-88f4-377be0659799","Type":"ContainerDied","Data":"d9d00594f32b03277f6e32eea1d707e2898b939e8712fc8cf4fb42054fca071f"} Dec 01 10:30:03 crc kubenswrapper[4867]: I1201 10:30:03.119833 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31106653-bdaa-49c3-b14c-8eb180b0b2c3/mysql-bootstrap/0.log" Dec 01 10:30:03 crc kubenswrapper[4867]: I1201 10:30:03.253054 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_d0ac7269-f887-4d9f-a582-6726a5be70f7/nova-scheduler-scheduler/0.log" Dec 01 10:30:03 crc kubenswrapper[4867]: I1201 10:30:03.627095 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31106653-bdaa-49c3-b14c-8eb180b0b2c3/mysql-bootstrap/0.log" Dec 01 10:30:03 crc kubenswrapper[4867]: I1201 10:30:03.732288 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31106653-bdaa-49c3-b14c-8eb180b0b2c3/galera/0.log" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.030277 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7a36be7a-7b6d-443d-94c6-4b3bdff15ec8/mysql-bootstrap/0.log" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.100165 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.110071 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7a36be7a-7b6d-443d-94c6-4b3bdff15ec8/galera/0.log" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.241929 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c517207-fab3-441a-88f4-377be0659799-secret-volume\") pod \"5c517207-fab3-441a-88f4-377be0659799\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.242014 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c517207-fab3-441a-88f4-377be0659799-config-volume\") pod \"5c517207-fab3-441a-88f4-377be0659799\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.242943 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jktms\" (UniqueName: \"kubernetes.io/projected/5c517207-fab3-441a-88f4-377be0659799-kube-api-access-jktms\") pod \"5c517207-fab3-441a-88f4-377be0659799\" (UID: \"5c517207-fab3-441a-88f4-377be0659799\") " Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.243105 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c517207-fab3-441a-88f4-377be0659799-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c517207-fab3-441a-88f4-377be0659799" (UID: "5c517207-fab3-441a-88f4-377be0659799"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.243803 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c517207-fab3-441a-88f4-377be0659799-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.249711 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c517207-fab3-441a-88f4-377be0659799-kube-api-access-jktms" (OuterVolumeSpecName: "kube-api-access-jktms") pod "5c517207-fab3-441a-88f4-377be0659799" (UID: "5c517207-fab3-441a-88f4-377be0659799"). InnerVolumeSpecName "kube-api-access-jktms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.253764 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7a36be7a-7b6d-443d-94c6-4b3bdff15ec8/mysql-bootstrap/0.log" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.272444 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c517207-fab3-441a-88f4-377be0659799-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c517207-fab3-441a-88f4-377be0659799" (UID: "5c517207-fab3-441a-88f4-377be0659799"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.349667 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jktms\" (UniqueName: \"kubernetes.io/projected/5c517207-fab3-441a-88f4-377be0659799-kube-api-access-jktms\") on node \"crc\" DevicePath \"\"" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.354167 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c517207-fab3-441a-88f4-377be0659799-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.466087 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_e2fbdcd3-0c11-4681-99af-c9b4fb717637/openstackclient/0.log" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.606211 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2f8e90f5-24d7-406e-a2aa-d44b9e6bac71/nova-metadata-metadata/0.log" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.693890 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" 
event={"ID":"5c517207-fab3-441a-88f4-377be0659799","Type":"ContainerDied","Data":"62797d2afd98200b08a3ebd1b8a2e93ed3fa11270d8dd0c3eae9ea8e84b59f6d"} Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.693935 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62797d2afd98200b08a3ebd1b8a2e93ed3fa11270d8dd0c3eae9ea8e84b59f6d" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.694051 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-kq6hw" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.966313 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pfvw2_feff7c40-c771-4824-b3f0-75c4d527044a/openstack-network-exporter/0.log" Dec 01 10:30:04 crc kubenswrapper[4867]: I1201 10:30:04.968533 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9jsgc_91f709b6-1fa8-40fb-80a0-45ea9510b009/ovsdb-server-init/0.log" Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.200066 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t"] Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.206508 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9jsgc_91f709b6-1fa8-40fb-80a0-45ea9510b009/ovs-vswitchd/0.log" Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.213382 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-6w67t"] Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.218412 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9jsgc_91f709b6-1fa8-40fb-80a0-45ea9510b009/ovsdb-server-init/0.log" Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.291656 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-9jsgc_91f709b6-1fa8-40fb-80a0-45ea9510b009/ovsdb-server/0.log" Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.500409 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vmh2x_aa810b5f-4cad-40cc-9feb-6afc38b56ab1/ovn-controller/0.log" Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.734181 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wpt2t_5f9d5cec-8d85-4f56-b876-06c32bb0a3e7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.790508 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_826ca141-06c3-45c3-9d5a-e99985971b80/openstack-network-exporter/0.log" Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.814047 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_826ca141-06c3-45c3-9d5a-e99985971b80/ovn-northd/0.log" Dec 01 10:30:05 crc kubenswrapper[4867]: I1201 10:30:05.990757 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b1081877-3550-4ad4-9a89-a5cddfc4ba31/openstack-network-exporter/0.log" Dec 01 10:30:06 crc kubenswrapper[4867]: I1201 10:30:06.086481 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b1081877-3550-4ad4-9a89-a5cddfc4ba31/ovsdbserver-nb/0.log" Dec 01 10:30:06 crc kubenswrapper[4867]: I1201 10:30:06.552335 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df/openstack-network-exporter/0.log" Dec 01 10:30:06 crc kubenswrapper[4867]: I1201 10:30:06.740297 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df/ovsdbserver-sb/0.log" Dec 01 10:30:06 crc kubenswrapper[4867]: I1201 10:30:06.859708 4867 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="076005a6-715b-44d7-9162-423fc55eb31d" path="/var/lib/kubelet/pods/076005a6-715b-44d7-9162-423fc55eb31d/volumes" Dec 01 10:30:07 crc kubenswrapper[4867]: I1201 10:30:07.146505 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1a327b42-8b19-491b-a9ba-2c11f0227183/setup-container/0.log" Dec 01 10:30:07 crc kubenswrapper[4867]: I1201 10:30:07.175668 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-68bfcdf768-4dtj7_a57f081c-e4b7-4dbb-a817-4d36052f3145/placement-api/0.log" Dec 01 10:30:07 crc kubenswrapper[4867]: I1201 10:30:07.272286 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-68bfcdf768-4dtj7_a57f081c-e4b7-4dbb-a817-4d36052f3145/placement-log/0.log" Dec 01 10:30:07 crc kubenswrapper[4867]: I1201 10:30:07.407143 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1a327b42-8b19-491b-a9ba-2c11f0227183/rabbitmq/0.log" Dec 01 10:30:07 crc kubenswrapper[4867]: I1201 10:30:07.449197 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1a327b42-8b19-491b-a9ba-2c11f0227183/setup-container/0.log" Dec 01 10:30:07 crc kubenswrapper[4867]: I1201 10:30:07.616004 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e3936faf-3dae-4db5-8851-10c1ebe7673b/setup-container/0.log" Dec 01 10:30:07 crc kubenswrapper[4867]: I1201 10:30:07.869385 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e3936faf-3dae-4db5-8851-10c1ebe7673b/setup-container/0.log" Dec 01 10:30:07 crc kubenswrapper[4867]: I1201 10:30:07.877920 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e3936faf-3dae-4db5-8851-10c1ebe7673b/rabbitmq/0.log" Dec 01 10:30:07 crc kubenswrapper[4867]: I1201 10:30:07.982602 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k_4ae71b24-0c2d-46fa-a8e4-3fb8261f6817/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:08 crc kubenswrapper[4867]: I1201 10:30:08.297541 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pk2lp_1560ec78-ac43-47a7-ab73-69a7decf4ed8/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:08 crc kubenswrapper[4867]: I1201 10:30:08.372476 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm_d8a14454-7ca8-4a2d-8626-5234d29dd688/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:08 crc kubenswrapper[4867]: I1201 10:30:08.582480 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gp785_612c2304-16fe-4932-824d-6116da3a4fb8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:08 crc kubenswrapper[4867]: I1201 10:30:08.734040 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2glfk_b20bdd78-fe72-4ce9-b909-440d2e47e153/ssh-known-hosts-edpm-deployment/0.log" Dec 01 10:30:09 crc kubenswrapper[4867]: I1201 10:30:09.059797 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9dcc6b98f-chkvz_476caa3a-28ba-471d-b4c0-c263c5960a87/proxy-server/0.log" Dec 01 10:30:09 crc kubenswrapper[4867]: I1201 10:30:09.120999 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9dcc6b98f-chkvz_476caa3a-28ba-471d-b4c0-c263c5960a87/proxy-httpd/0.log" Dec 01 10:30:09 crc kubenswrapper[4867]: I1201 10:30:09.317198 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-n24dx_14b301a3-7288-471a-8ca4-cb7f4dca4b96/swift-ring-rebalance/0.log" Dec 01 10:30:09 crc kubenswrapper[4867]: I1201 10:30:09.427023 4867 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/account-auditor/0.log" Dec 01 10:30:09 crc kubenswrapper[4867]: I1201 10:30:09.475527 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/account-reaper/0.log" Dec 01 10:30:09 crc kubenswrapper[4867]: I1201 10:30:09.604367 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/account-replicator/0.log" Dec 01 10:30:09 crc kubenswrapper[4867]: I1201 10:30:09.869638 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/container-replicator/0.log" Dec 01 10:30:09 crc kubenswrapper[4867]: I1201 10:30:09.903544 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/container-auditor/0.log" Dec 01 10:30:09 crc kubenswrapper[4867]: I1201 10:30:09.943517 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/account-server/0.log" Dec 01 10:30:10 crc kubenswrapper[4867]: I1201 10:30:10.333919 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/container-server/0.log" Dec 01 10:30:10 crc kubenswrapper[4867]: I1201 10:30:10.910676 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/container-updater/0.log" Dec 01 10:30:10 crc kubenswrapper[4867]: I1201 10:30:10.918594 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-expirer/0.log" Dec 01 10:30:10 crc kubenswrapper[4867]: I1201 10:30:10.946515 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-replicator/0.log" Dec 01 10:30:10 crc kubenswrapper[4867]: I1201 10:30:10.952028 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-auditor/0.log" Dec 01 10:30:11 crc kubenswrapper[4867]: I1201 10:30:11.192052 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/rsync/0.log" Dec 01 10:30:11 crc kubenswrapper[4867]: I1201 10:30:11.257304 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/swift-recon-cron/0.log" Dec 01 10:30:11 crc kubenswrapper[4867]: I1201 10:30:11.264181 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-updater/0.log" Dec 01 10:30:11 crc kubenswrapper[4867]: I1201 10:30:11.272579 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-server/0.log" Dec 01 10:30:11 crc kubenswrapper[4867]: I1201 10:30:11.642203 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_31b3d747-c383-483d-8919-be1dd3a266b6/tempest-tests-tempest-tests-runner/0.log" Dec 01 10:30:11 crc kubenswrapper[4867]: I1201 10:30:11.775440 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-c8df6_8a874825-a4d4-446d-b1fe-3317e3b67d55/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:11 crc kubenswrapper[4867]: I1201 10:30:11.959345 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_640e57ba-3d94-41be-bcd7-0c5eeff8a092/test-operator-logs-container/0.log" Dec 01 10:30:12 crc kubenswrapper[4867]: I1201 
10:30:12.511662 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8_21f6bea0-2abe-4029-8272-f6da0825cf69/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:21 crc kubenswrapper[4867]: I1201 10:30:21.602007 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:30:21 crc kubenswrapper[4867]: I1201 10:30:21.602605 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:30:22 crc kubenswrapper[4867]: I1201 10:30:22.729987 4867 scope.go:117] "RemoveContainer" containerID="1354c2e646aa137f7bf8a4e3b810ae84b8997b03427351a0bed35aef4fe9d328" Dec 01 10:30:26 crc kubenswrapper[4867]: I1201 10:30:26.329056 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2a737446-2c4b-44f1-b660-9e433c5eb2d1/memcached/0.log" Dec 01 10:30:47 crc kubenswrapper[4867]: I1201 10:30:47.191177 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/util/0.log" Dec 01 10:30:47 crc kubenswrapper[4867]: I1201 10:30:47.417131 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/util/0.log" Dec 01 10:30:47 crc kubenswrapper[4867]: I1201 10:30:47.460172 4867 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/pull/0.log" Dec 01 10:30:47 crc kubenswrapper[4867]: I1201 10:30:47.470069 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/pull/0.log" Dec 01 10:30:47 crc kubenswrapper[4867]: I1201 10:30:47.775646 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/extract/0.log" Dec 01 10:30:47 crc kubenswrapper[4867]: I1201 10:30:47.781368 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/util/0.log" Dec 01 10:30:47 crc kubenswrapper[4867]: I1201 10:30:47.821122 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/pull/0.log" Dec 01 10:30:48 crc kubenswrapper[4867]: I1201 10:30:47.999801 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-nrm56_0e850850-d946-42aa-a035-1bf8dcba402f/kube-rbac-proxy/0.log" Dec 01 10:30:48 crc kubenswrapper[4867]: I1201 10:30:48.080584 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-4wbsd_c10410e7-47b2-4a48-bf7d-440a00afd4b4/kube-rbac-proxy/0.log" Dec 01 10:30:48 crc kubenswrapper[4867]: I1201 10:30:48.164082 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-nrm56_0e850850-d946-42aa-a035-1bf8dcba402f/manager/0.log" Dec 01 10:30:48 crc 
kubenswrapper[4867]: I1201 10:30:48.539438 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-4wbsd_c10410e7-47b2-4a48-bf7d-440a00afd4b4/manager/0.log" Dec 01 10:30:48 crc kubenswrapper[4867]: I1201 10:30:48.574431 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-9nc4v_0deeeac8-147f-441c-ba67-2e6e9bc32073/kube-rbac-proxy/0.log" Dec 01 10:30:48 crc kubenswrapper[4867]: I1201 10:30:48.653167 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-9nc4v_0deeeac8-147f-441c-ba67-2e6e9bc32073/manager/0.log" Dec 01 10:30:48 crc kubenswrapper[4867]: I1201 10:30:48.787586 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-vktv2_68e139fd-19f5-4033-93b8-4ebf8397b510/kube-rbac-proxy/0.log" Dec 01 10:30:48 crc kubenswrapper[4867]: I1201 10:30:48.849091 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-vktv2_68e139fd-19f5-4033-93b8-4ebf8397b510/manager/0.log" Dec 01 10:30:49 crc kubenswrapper[4867]: I1201 10:30:49.060244 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-zdllr_0d369519-2f02-4efe-9deb-885362964597/kube-rbac-proxy/0.log" Dec 01 10:30:49 crc kubenswrapper[4867]: I1201 10:30:49.064203 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-zdllr_0d369519-2f02-4efe-9deb-885362964597/manager/0.log" Dec 01 10:30:49 crc kubenswrapper[4867]: I1201 10:30:49.179178 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-p7rms_468cf199-ea48-4a5a-ac34-057670369f66/kube-rbac-proxy/0.log" Dec 01 10:30:49 crc kubenswrapper[4867]: I1201 10:30:49.296087 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-p7rms_468cf199-ea48-4a5a-ac34-057670369f66/manager/0.log" Dec 01 10:30:49 crc kubenswrapper[4867]: I1201 10:30:49.450405 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-24whr_b5f9e64b-a7d0-4437-91ac-f84c2441cd8d/kube-rbac-proxy/0.log" Dec 01 10:30:49 crc kubenswrapper[4867]: I1201 10:30:49.624392 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-b4j75_fd8d1846-f143-4ca0-88df-af3eca96175d/kube-rbac-proxy/0.log" Dec 01 10:30:49 crc kubenswrapper[4867]: I1201 10:30:49.653779 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-24whr_b5f9e64b-a7d0-4437-91ac-f84c2441cd8d/manager/0.log" Dec 01 10:30:49 crc kubenswrapper[4867]: I1201 10:30:49.679098 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-b4j75_fd8d1846-f143-4ca0-88df-af3eca96175d/manager/0.log" Dec 01 10:30:50 crc kubenswrapper[4867]: I1201 10:30:50.388404 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-492tf_07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a/kube-rbac-proxy/0.log" Dec 01 10:30:50 crc kubenswrapper[4867]: I1201 10:30:50.552631 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-492tf_07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a/manager/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.045706 
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-hlksd_a8956c5b-7421-4442-8d62-773a5fe02fd0/kube-rbac-proxy/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.047032 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-hlksd_a8956c5b-7421-4442-8d62-773a5fe02fd0/manager/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.095209 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-77sbx_cb8d2624-ad08-41e7-bb2a-48bc75a2dd62/kube-rbac-proxy/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.254044 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-77sbx_cb8d2624-ad08-41e7-bb2a-48bc75a2dd62/manager/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.326824 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-twg2p_30c79a23-86f2-4a05-adde-41ada03e2e7e/manager/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.329103 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-twg2p_30c79a23-86f2-4a05-adde-41ada03e2e7e/kube-rbac-proxy/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.500973 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-sjmfh_656a9362-30cf-43f6-9909-95859bef129e/kube-rbac-proxy/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.582261 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mxkvs_e9fd074d-b9bc-4215-bfd7-56df604f101c/kube-rbac-proxy/0.log" Dec 01 
10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.600756 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.600801 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.648633 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-sjmfh_656a9362-30cf-43f6-9909-95859bef129e/manager/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.778018 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4_4d73996e-90d0-44f5-85f9-3800f54fc3d7/kube-rbac-proxy/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.796694 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mxkvs_e9fd074d-b9bc-4215-bfd7-56df604f101c/manager/0.log" Dec 01 10:30:51 crc kubenswrapper[4867]: I1201 10:30:51.848068 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4_4d73996e-90d0-44f5-85f9-3800f54fc3d7/manager/0.log" Dec 01 10:30:52 crc kubenswrapper[4867]: I1201 10:30:52.237053 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-zvsh9_cbb9c171-f076-44a2-9a0a-fafd9aa101ca/registry-server/0.log" Dec 01 10:30:52 crc kubenswrapper[4867]: I1201 10:30:52.315973 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-66fc949795-bpdpc_8072c3c3-367c-47af-952b-f303a97d1afe/operator/0.log" Dec 01 10:30:52 crc kubenswrapper[4867]: I1201 10:30:52.392601 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-grrzp_461764b0-73a3-4866-aec1-e687293591e3/kube-rbac-proxy/0.log" Dec 01 10:30:52 crc kubenswrapper[4867]: I1201 10:30:52.643796 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-grrzp_461764b0-73a3-4866-aec1-e687293591e3/manager/0.log" Dec 01 10:30:52 crc kubenswrapper[4867]: I1201 10:30:52.729701 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bhgk8_ffbd9e52-147b-42cd-abaa-ec7d1341b826/manager/0.log" Dec 01 10:30:52 crc kubenswrapper[4867]: I1201 10:30:52.758021 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bhgk8_ffbd9e52-147b-42cd-abaa-ec7d1341b826/kube-rbac-proxy/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.059907 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-68x8r_bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf/operator/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.082898 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-j698r_f3176675-0a3a-4fd2-9727-349ec1b88de7/kube-rbac-proxy/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.236620 4867 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-j698r_f3176675-0a3a-4fd2-9727-349ec1b88de7/manager/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.271254 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56cfc94774-wn77q_860dbd82-4e88-4090-8ce6-658e3201ef67/manager/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.428777 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-l7jwc_9be92a6c-afef-449e-927b-8d0732a2140a/kube-rbac-proxy/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.454700 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-l7jwc_9be92a6c-afef-449e-927b-8d0732a2140a/manager/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.544260 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-g2ddn_573accdf-9cb1-4643-af86-744e695a1f9d/kube-rbac-proxy/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.589102 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-g2ddn_573accdf-9cb1-4643-af86-744e695a1f9d/manager/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.668127 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-swrc5_c900776b-c7ea-4e4d-9b6b-00245cf048ce/kube-rbac-proxy/0.log" Dec 01 10:30:53 crc kubenswrapper[4867]: I1201 10:30:53.722755 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-swrc5_c900776b-c7ea-4e4d-9b6b-00245cf048ce/manager/0.log" Dec 01 10:31:16 crc kubenswrapper[4867]: 
I1201 10:31:16.870557 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mph7x_678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5/control-plane-machine-set-operator/0.log" Dec 01 10:31:17 crc kubenswrapper[4867]: I1201 10:31:17.045673 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-545ws_e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a/kube-rbac-proxy/0.log" Dec 01 10:31:17 crc kubenswrapper[4867]: I1201 10:31:17.198389 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-545ws_e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a/machine-api-operator/0.log" Dec 01 10:31:21 crc kubenswrapper[4867]: I1201 10:31:21.601569 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:31:21 crc kubenswrapper[4867]: I1201 10:31:21.602194 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:31:21 crc kubenswrapper[4867]: I1201 10:31:21.602251 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 10:31:21 crc kubenswrapper[4867]: I1201 10:31:21.616167 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb"} 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:31:21 crc kubenswrapper[4867]: I1201 10:31:21.616260 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" gracePeriod=600 Dec 01 10:31:21 crc kubenswrapper[4867]: E1201 10:31:21.766091 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:31:22 crc kubenswrapper[4867]: I1201 10:31:22.467145 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" exitCode=0 Dec 01 10:31:22 crc kubenswrapper[4867]: I1201 10:31:22.467341 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb"} Dec 01 10:31:22 crc kubenswrapper[4867]: I1201 10:31:22.467578 4867 scope.go:117] "RemoveContainer" containerID="d5239f7b77b22875fa81a2f4170c3ec389ab40f1ee4ab2299324fc1ac724ac7b" Dec 01 10:31:22 crc kubenswrapper[4867]: I1201 10:31:22.468475 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 
01 10:31:22 crc kubenswrapper[4867]: E1201 10:31:22.468893 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:31:32 crc kubenswrapper[4867]: I1201 10:31:32.174765 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-78xzp_210c03bc-36ef-4bc0-ba17-db783a56d470/cert-manager-controller/0.log" Dec 01 10:31:32 crc kubenswrapper[4867]: I1201 10:31:32.375246 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-njkbq_6b09ecf0-40b3-4271-97da-a662a4b427d6/cert-manager-cainjector/0.log" Dec 01 10:31:32 crc kubenswrapper[4867]: I1201 10:31:32.431464 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nwld2_de894f99-4158-4096-b100-4758130c6c12/cert-manager-webhook/0.log" Dec 01 10:31:37 crc kubenswrapper[4867]: I1201 10:31:37.828184 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:31:37 crc kubenswrapper[4867]: E1201 10:31:37.828917 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:31:47 crc kubenswrapper[4867]: I1201 10:31:47.648758 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-fzb6f_f6e4c850-11f6-495b-a90f-5936dda915e7/nmstate-console-plugin/0.log" Dec 01 10:31:47 crc kubenswrapper[4867]: I1201 10:31:47.894057 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-cbknx_7a3fd2df-a271-4ff0-8488-7f442aedf04e/nmstate-handler/0.log" Dec 01 10:31:47 crc kubenswrapper[4867]: I1201 10:31:47.990115 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-dnfqt_4749ce2f-6e1e-47ef-a5f1-bdd63f409214/kube-rbac-proxy/0.log" Dec 01 10:31:48 crc kubenswrapper[4867]: I1201 10:31:48.028905 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-dnfqt_4749ce2f-6e1e-47ef-a5f1-bdd63f409214/nmstate-metrics/0.log" Dec 01 10:31:48 crc kubenswrapper[4867]: I1201 10:31:48.863873 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-wthkz_92290d91-f34b-4ef8-a2a7-15ed05a8c2a5/nmstate-operator/0.log" Dec 01 10:31:48 crc kubenswrapper[4867]: I1201 10:31:48.940149 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-s47c8_e2549111-bcf2-4c87-abdd-0d4cd9353be9/nmstate-webhook/0.log" Dec 01 10:31:49 crc kubenswrapper[4867]: I1201 10:31:49.826966 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:31:49 crc kubenswrapper[4867]: E1201 10:31:49.827706 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 
01 10:32:03 crc kubenswrapper[4867]: I1201 10:32:03.827307 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:32:03 crc kubenswrapper[4867]: E1201 10:32:03.828170 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.051668 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hfnbg_cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a/kube-rbac-proxy/0.log" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.149547 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hfnbg_cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a/controller/0.log" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.294469 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-frr-files/0.log" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.446293 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-frr-files/0.log" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.532297 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-reloader/0.log" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.565742 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-metrics/0.log" Dec 01 10:32:05 crc 
kubenswrapper[4867]: I1201 10:32:05.573155 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-reloader/0.log" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.777211 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-reloader/0.log" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.777417 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-frr-files/0.log" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.839882 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-metrics/0.log" Dec 01 10:32:05 crc kubenswrapper[4867]: I1201 10:32:05.861644 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-metrics/0.log" Dec 01 10:32:06 crc kubenswrapper[4867]: I1201 10:32:06.148137 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-frr-files/0.log" Dec 01 10:32:06 crc kubenswrapper[4867]: I1201 10:32:06.156070 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-reloader/0.log" Dec 01 10:32:06 crc kubenswrapper[4867]: I1201 10:32:06.264382 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/controller/0.log" Dec 01 10:32:06 crc kubenswrapper[4867]: I1201 10:32:06.267978 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-metrics/0.log" Dec 01 10:32:06 crc kubenswrapper[4867]: I1201 10:32:06.382649 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/frr-metrics/0.log" Dec 01 10:32:06 crc kubenswrapper[4867]: I1201 10:32:06.495880 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/kube-rbac-proxy-frr/0.log" Dec 01 10:32:06 crc kubenswrapper[4867]: I1201 10:32:06.534636 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/kube-rbac-proxy/0.log" Dec 01 10:32:06 crc kubenswrapper[4867]: I1201 10:32:06.670167 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/reloader/0.log" Dec 01 10:32:06 crc kubenswrapper[4867]: I1201 10:32:06.892488 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-4bvvs_d2bcb3a5-5fb9-4c77-9f79-6d88033b8669/frr-k8s-webhook-server/0.log" Dec 01 10:32:07 crc kubenswrapper[4867]: I1201 10:32:07.087935 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d79d8d46b-pxjk5_82e433dd-78d1-4cb0-a670-e19c67e09515/manager/0.log" Dec 01 10:32:07 crc kubenswrapper[4867]: I1201 10:32:07.334461 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6f489b594c-qqhv4_cb69f179-7caf-472c-9b20-f327c116f4a2/webhook-server/0.log" Dec 01 10:32:07 crc kubenswrapper[4867]: I1201 10:32:07.519759 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bczj5_c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3/kube-rbac-proxy/0.log" Dec 01 10:32:07 crc kubenswrapper[4867]: I1201 10:32:07.666216 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/frr/0.log" Dec 01 10:32:08 crc kubenswrapper[4867]: I1201 10:32:08.002512 4867 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bczj5_c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3/speaker/0.log" Dec 01 10:32:17 crc kubenswrapper[4867]: I1201 10:32:17.826474 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:32:17 crc kubenswrapper[4867]: E1201 10:32:17.827341 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:32:22 crc kubenswrapper[4867]: I1201 10:32:22.849209 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/util/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.067434 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/pull/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.082723 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/pull/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.134475 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/util/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.341536 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/pull/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.349865 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/util/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.373133 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/extract/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.518757 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/util/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.677421 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/pull/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.683938 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/pull/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.693610 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/util/0.log" Dec 01 10:32:23 crc kubenswrapper[4867]: I1201 10:32:23.920086 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/extract/0.log" Dec 
01 10:32:24 crc kubenswrapper[4867]: I1201 10:32:24.054766 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/pull/0.log" Dec 01 10:32:24 crc kubenswrapper[4867]: I1201 10:32:24.122189 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-utilities/0.log" Dec 01 10:32:24 crc kubenswrapper[4867]: I1201 10:32:24.122433 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/util/0.log" Dec 01 10:32:24 crc kubenswrapper[4867]: I1201 10:32:24.306198 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-utilities/0.log" Dec 01 10:32:24 crc kubenswrapper[4867]: I1201 10:32:24.336528 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-content/0.log" Dec 01 10:32:24 crc kubenswrapper[4867]: I1201 10:32:24.367134 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-content/0.log" Dec 01 10:32:24 crc kubenswrapper[4867]: I1201 10:32:24.561642 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-content/0.log" Dec 01 10:32:24 crc kubenswrapper[4867]: I1201 10:32:24.562773 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-utilities/0.log" Dec 01 10:32:24 crc kubenswrapper[4867]: I1201 10:32:24.838414 
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-utilities/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.114475 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-utilities/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.174992 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-content/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.184027 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-content/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.196750 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/registry-server/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.376442 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-content/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.419026 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-utilities/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.613217 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/registry-server/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.701425 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7gpdj_a222161f-afcc-47dc-bc2f-50b228543866/marketplace-operator/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.787801 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-utilities/0.log" Dec 01 10:32:25 crc kubenswrapper[4867]: I1201 10:32:25.992629 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-content/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.006432 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-utilities/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.020221 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-content/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.265534 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-utilities/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.294743 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-content/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.432007 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/registry-server/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.547058 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-54smd_a7c3f527-e2e1-4a92-b2dc-76cb294f84a6/extract-utilities/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.707832 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-54smd_a7c3f527-e2e1-4a92-b2dc-76cb294f84a6/extract-utilities/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.836709 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-54smd_a7c3f527-e2e1-4a92-b2dc-76cb294f84a6/extract-content/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.864468 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-54smd_a7c3f527-e2e1-4a92-b2dc-76cb294f84a6/extract-content/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.958312 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-54smd_a7c3f527-e2e1-4a92-b2dc-76cb294f84a6/extract-utilities/0.log" Dec 01 10:32:26 crc kubenswrapper[4867]: I1201 10:32:26.964093 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-54smd_a7c3f527-e2e1-4a92-b2dc-76cb294f84a6/extract-content/0.log" Dec 01 10:32:27 crc kubenswrapper[4867]: I1201 10:32:27.211940 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-54smd_a7c3f527-e2e1-4a92-b2dc-76cb294f84a6/registry-server/0.log" Dec 01 10:32:32 crc kubenswrapper[4867]: I1201 10:32:32.826944 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:32:32 crc kubenswrapper[4867]: E1201 10:32:32.828591 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:32:47 crc kubenswrapper[4867]: I1201 10:32:47.827125 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:32:47 crc kubenswrapper[4867]: E1201 10:32:47.828121 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:33:02 crc kubenswrapper[4867]: I1201 10:33:02.827620 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:33:02 crc kubenswrapper[4867]: E1201 10:33:02.828656 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:33:17 crc kubenswrapper[4867]: I1201 10:33:17.826691 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:33:17 crc kubenswrapper[4867]: E1201 10:33:17.827551 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:33:30 crc kubenswrapper[4867]: I1201 10:33:30.827533 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:33:30 crc kubenswrapper[4867]: E1201 10:33:30.828578 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:33:42 crc kubenswrapper[4867]: I1201 10:33:42.827830 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:33:42 crc kubenswrapper[4867]: E1201 10:33:42.829431 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:33:54 crc kubenswrapper[4867]: I1201 10:33:54.827197 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:33:54 crc kubenswrapper[4867]: E1201 10:33:54.830334 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:34:07 crc kubenswrapper[4867]: I1201 10:34:07.828335 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:34:07 crc kubenswrapper[4867]: E1201 10:34:07.829276 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:34:20 crc kubenswrapper[4867]: I1201 10:34:20.827162 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:34:20 crc kubenswrapper[4867]: E1201 10:34:20.828284 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:34:32 crc kubenswrapper[4867]: I1201 10:34:32.827798 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:34:32 crc kubenswrapper[4867]: E1201 10:34:32.828508 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:34:45 crc kubenswrapper[4867]: I1201 10:34:45.828423 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:34:45 crc kubenswrapper[4867]: E1201 10:34:45.829102 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:34:48 crc kubenswrapper[4867]: I1201 10:34:48.473164 4867 generic.go:334] "Generic (PLEG): container finished" podID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" containerID="e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f" exitCode=0 Dec 01 10:34:48 crc kubenswrapper[4867]: I1201 10:34:48.473243 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j9btd/must-gather-cgj5v" event={"ID":"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0","Type":"ContainerDied","Data":"e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f"} Dec 01 10:34:48 crc kubenswrapper[4867]: I1201 10:34:48.475205 4867 scope.go:117] "RemoveContainer" containerID="e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f" Dec 01 10:34:48 crc kubenswrapper[4867]: I1201 10:34:48.931943 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j9btd_must-gather-cgj5v_085eeb7f-e0e8-4677-a788-4c1ba9fba5f0/gather/0.log" Dec 01 10:34:57 crc kubenswrapper[4867]: I1201 10:34:57.646031 
4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j9btd/must-gather-cgj5v"] Dec 01 10:34:57 crc kubenswrapper[4867]: I1201 10:34:57.646747 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-j9btd/must-gather-cgj5v" podUID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" containerName="copy" containerID="cri-o://4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9" gracePeriod=2 Dec 01 10:34:57 crc kubenswrapper[4867]: I1201 10:34:57.658352 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j9btd/must-gather-cgj5v"] Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.135514 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j9btd_must-gather-cgj5v_085eeb7f-e0e8-4677-a788-4c1ba9fba5f0/copy/0.log" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.136352 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.315010 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p4ds\" (UniqueName: \"kubernetes.io/projected/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-kube-api-access-5p4ds\") pod \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\" (UID: \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\") " Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.315196 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-must-gather-output\") pod \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\" (UID: \"085eeb7f-e0e8-4677-a788-4c1ba9fba5f0\") " Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.321241 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-kube-api-access-5p4ds" (OuterVolumeSpecName: "kube-api-access-5p4ds") pod "085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" (UID: "085eeb7f-e0e8-4677-a788-4c1ba9fba5f0"). InnerVolumeSpecName "kube-api-access-5p4ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.417853 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p4ds\" (UniqueName: \"kubernetes.io/projected/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-kube-api-access-5p4ds\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.495348 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" (UID: "085eeb7f-e0e8-4677-a788-4c1ba9fba5f0"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.519915 4867 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.570132 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j9btd_must-gather-cgj5v_085eeb7f-e0e8-4677-a788-4c1ba9fba5f0/copy/0.log" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.570587 4867 generic.go:334] "Generic (PLEG): container finished" podID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" containerID="4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9" exitCode=143 Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.570646 4867 scope.go:117] "RemoveContainer" containerID="4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.570656 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j9btd/must-gather-cgj5v" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.604204 4867 scope.go:117] "RemoveContainer" containerID="e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.665417 4867 scope.go:117] "RemoveContainer" containerID="4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9" Dec 01 10:34:58 crc kubenswrapper[4867]: E1201 10:34:58.666610 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9\": container with ID starting with 4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9 not found: ID does not exist" containerID="4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.666872 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9"} err="failed to get container status \"4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9\": rpc error: code = NotFound desc = could not find container \"4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9\": container with ID starting with 4fd34707c2d0b9d1c7091d6e1521aeacc2feea348c9d07360cc48c6814dd70d9 not found: ID does not exist" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.666901 4867 scope.go:117] "RemoveContainer" containerID="e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f" Dec 01 10:34:58 crc kubenswrapper[4867]: E1201 10:34:58.667160 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f\": container with ID starting with 
e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f not found: ID does not exist" containerID="e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.667186 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f"} err="failed to get container status \"e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f\": rpc error: code = NotFound desc = could not find container \"e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f\": container with ID starting with e60205b6321c9ed622744c00e57010e45a49887c0d2b638850971c7ffb5f5c2f not found: ID does not exist" Dec 01 10:34:58 crc kubenswrapper[4867]: I1201 10:34:58.840608 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" path="/var/lib/kubelet/pods/085eeb7f-e0e8-4677-a788-4c1ba9fba5f0/volumes" Dec 01 10:34:59 crc kubenswrapper[4867]: I1201 10:34:59.827957 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:34:59 crc kubenswrapper[4867]: E1201 10:34:59.829010 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.414724 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fx2b9"] Dec 01 10:35:05 crc kubenswrapper[4867]: E1201 10:35:05.415662 4867 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5c517207-fab3-441a-88f4-377be0659799" containerName="collect-profiles" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.415675 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c517207-fab3-441a-88f4-377be0659799" containerName="collect-profiles" Dec 01 10:35:05 crc kubenswrapper[4867]: E1201 10:35:05.415704 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" containerName="gather" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.415710 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" containerName="gather" Dec 01 10:35:05 crc kubenswrapper[4867]: E1201 10:35:05.415732 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" containerName="copy" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.415738 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" containerName="copy" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.415920 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" containerName="copy" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.415932 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="085eeb7f-e0e8-4677-a788-4c1ba9fba5f0" containerName="gather" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.415948 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c517207-fab3-441a-88f4-377be0659799" containerName="collect-profiles" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.417203 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.426267 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fx2b9"] Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.548879 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l27hj\" (UniqueName: \"kubernetes.io/projected/88b8930f-d497-4791-8c9f-0da2fbcf49e1-kube-api-access-l27hj\") pod \"community-operators-fx2b9\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.549004 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-catalog-content\") pod \"community-operators-fx2b9\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.549170 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-utilities\") pod \"community-operators-fx2b9\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.651359 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l27hj\" (UniqueName: \"kubernetes.io/projected/88b8930f-d497-4791-8c9f-0da2fbcf49e1-kube-api-access-l27hj\") pod \"community-operators-fx2b9\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.651502 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-catalog-content\") pod \"community-operators-fx2b9\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.651564 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-utilities\") pod \"community-operators-fx2b9\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.656262 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-utilities\") pod \"community-operators-fx2b9\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.657097 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-catalog-content\") pod \"community-operators-fx2b9\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.686168 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l27hj\" (UniqueName: \"kubernetes.io/projected/88b8930f-d497-4791-8c9f-0da2fbcf49e1-kube-api-access-l27hj\") pod \"community-operators-fx2b9\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:05 crc kubenswrapper[4867]: I1201 10:35:05.746260 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:06 crc kubenswrapper[4867]: I1201 10:35:06.182340 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fx2b9"] Dec 01 10:35:06 crc kubenswrapper[4867]: I1201 10:35:06.650160 4867 generic.go:334] "Generic (PLEG): container finished" podID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerID="7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26" exitCode=0 Dec 01 10:35:06 crc kubenswrapper[4867]: I1201 10:35:06.650224 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fx2b9" event={"ID":"88b8930f-d497-4791-8c9f-0da2fbcf49e1","Type":"ContainerDied","Data":"7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26"} Dec 01 10:35:06 crc kubenswrapper[4867]: I1201 10:35:06.650415 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fx2b9" event={"ID":"88b8930f-d497-4791-8c9f-0da2fbcf49e1","Type":"ContainerStarted","Data":"0f572704621bda471fd644307120efa98a5e801aa5e98c19acb476effdaace2c"} Dec 01 10:35:06 crc kubenswrapper[4867]: I1201 10:35:06.652012 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:35:07 crc kubenswrapper[4867]: I1201 10:35:07.661903 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fx2b9" event={"ID":"88b8930f-d497-4791-8c9f-0da2fbcf49e1","Type":"ContainerStarted","Data":"507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6"} Dec 01 10:35:08 crc kubenswrapper[4867]: I1201 10:35:08.672666 4867 generic.go:334] "Generic (PLEG): container finished" podID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerID="507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6" exitCode=0 Dec 01 10:35:08 crc kubenswrapper[4867]: I1201 10:35:08.672714 4867 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-fx2b9" event={"ID":"88b8930f-d497-4791-8c9f-0da2fbcf49e1","Type":"ContainerDied","Data":"507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6"} Dec 01 10:35:09 crc kubenswrapper[4867]: I1201 10:35:09.684068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fx2b9" event={"ID":"88b8930f-d497-4791-8c9f-0da2fbcf49e1","Type":"ContainerStarted","Data":"7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44"} Dec 01 10:35:09 crc kubenswrapper[4867]: I1201 10:35:09.705075 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fx2b9" podStartSLOduration=2.053765541 podStartE2EDuration="4.70505625s" podCreationTimestamp="2025-12-01 10:35:05 +0000 UTC" firstStartedPulling="2025-12-01 10:35:06.65178108 +0000 UTC m=+5228.111167834" lastFinishedPulling="2025-12-01 10:35:09.303071779 +0000 UTC m=+5230.762458543" observedRunningTime="2025-12-01 10:35:09.698450859 +0000 UTC m=+5231.157837643" watchObservedRunningTime="2025-12-01 10:35:09.70505625 +0000 UTC m=+5231.164443004" Dec 01 10:35:14 crc kubenswrapper[4867]: I1201 10:35:14.827622 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:35:14 crc kubenswrapper[4867]: E1201 10:35:14.828358 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:35:15 crc kubenswrapper[4867]: I1201 10:35:15.748489 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:15 crc kubenswrapper[4867]: I1201 10:35:15.749101 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:15 crc kubenswrapper[4867]: I1201 10:35:15.815998 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:16 crc kubenswrapper[4867]: I1201 10:35:16.814517 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:16 crc kubenswrapper[4867]: I1201 10:35:16.883050 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fx2b9"] Dec 01 10:35:18 crc kubenswrapper[4867]: I1201 10:35:18.764233 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fx2b9" podUID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerName="registry-server" containerID="cri-o://7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44" gracePeriod=2 Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.708539 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.775855 4867 generic.go:334] "Generic (PLEG): container finished" podID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerID="7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44" exitCode=0 Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.775893 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fx2b9" event={"ID":"88b8930f-d497-4791-8c9f-0da2fbcf49e1","Type":"ContainerDied","Data":"7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44"} Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.775917 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fx2b9" event={"ID":"88b8930f-d497-4791-8c9f-0da2fbcf49e1","Type":"ContainerDied","Data":"0f572704621bda471fd644307120efa98a5e801aa5e98c19acb476effdaace2c"} Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.775933 4867 scope.go:117] "RemoveContainer" containerID="7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.776053 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fx2b9" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.798486 4867 scope.go:117] "RemoveContainer" containerID="507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.820147 4867 scope.go:117] "RemoveContainer" containerID="7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.830214 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-utilities\") pod \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.830503 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-catalog-content\") pod \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.830602 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l27hj\" (UniqueName: \"kubernetes.io/projected/88b8930f-d497-4791-8c9f-0da2fbcf49e1-kube-api-access-l27hj\") pod \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\" (UID: \"88b8930f-d497-4791-8c9f-0da2fbcf49e1\") " Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.831191 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-utilities" (OuterVolumeSpecName: "utilities") pod "88b8930f-d497-4791-8c9f-0da2fbcf49e1" (UID: "88b8930f-d497-4791-8c9f-0da2fbcf49e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.839675 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b8930f-d497-4791-8c9f-0da2fbcf49e1-kube-api-access-l27hj" (OuterVolumeSpecName: "kube-api-access-l27hj") pod "88b8930f-d497-4791-8c9f-0da2fbcf49e1" (UID: "88b8930f-d497-4791-8c9f-0da2fbcf49e1"). InnerVolumeSpecName "kube-api-access-l27hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.881702 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88b8930f-d497-4791-8c9f-0da2fbcf49e1" (UID: "88b8930f-d497-4791-8c9f-0da2fbcf49e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.905560 4867 scope.go:117] "RemoveContainer" containerID="7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44" Dec 01 10:35:19 crc kubenswrapper[4867]: E1201 10:35:19.906002 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44\": container with ID starting with 7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44 not found: ID does not exist" containerID="7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.906165 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44"} err="failed to get container status \"7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44\": rpc error: code = NotFound desc = could not find 
container \"7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44\": container with ID starting with 7fa06f4e2bb51dfde4e6f118ed8803ef97d8901d029a9b17e66d4b1741a7ca44 not found: ID does not exist" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.906356 4867 scope.go:117] "RemoveContainer" containerID="507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6" Dec 01 10:35:19 crc kubenswrapper[4867]: E1201 10:35:19.906788 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6\": container with ID starting with 507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6 not found: ID does not exist" containerID="507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.906828 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6"} err="failed to get container status \"507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6\": rpc error: code = NotFound desc = could not find container \"507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6\": container with ID starting with 507696a61163e077d5e585155caaf7d7167f1ff975116b822f6ded20f066aee6 not found: ID does not exist" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.906847 4867 scope.go:117] "RemoveContainer" containerID="7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26" Dec 01 10:35:19 crc kubenswrapper[4867]: E1201 10:35:19.907057 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26\": container with ID starting with 7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26 not found: ID does 
not exist" containerID="7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.907078 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26"} err="failed to get container status \"7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26\": rpc error: code = NotFound desc = could not find container \"7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26\": container with ID starting with 7bba81d3ac2375b5c453d095fe8ba7bde4fe3f213e2328bed53e22c786834d26 not found: ID does not exist" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.932744 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.932777 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l27hj\" (UniqueName: \"kubernetes.io/projected/88b8930f-d497-4791-8c9f-0da2fbcf49e1-kube-api-access-l27hj\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:19 crc kubenswrapper[4867]: I1201 10:35:19.932792 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b8930f-d497-4791-8c9f-0da2fbcf49e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:20 crc kubenswrapper[4867]: I1201 10:35:20.116232 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fx2b9"] Dec 01 10:35:20 crc kubenswrapper[4867]: I1201 10:35:20.134263 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fx2b9"] Dec 01 10:35:20 crc kubenswrapper[4867]: I1201 10:35:20.843868 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" path="/var/lib/kubelet/pods/88b8930f-d497-4791-8c9f-0da2fbcf49e1/volumes" Dec 01 10:35:28 crc kubenswrapper[4867]: I1201 10:35:28.832495 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:35:28 crc kubenswrapper[4867]: E1201 10:35:28.833205 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:35:39 crc kubenswrapper[4867]: I1201 10:35:39.827218 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:35:39 crc kubenswrapper[4867]: E1201 10:35:39.828001 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:35:50 crc kubenswrapper[4867]: I1201 10:35:50.827253 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:35:50 crc kubenswrapper[4867]: E1201 10:35:50.828175 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:36:04 crc kubenswrapper[4867]: I1201 10:36:04.827260 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:36:04 crc kubenswrapper[4867]: E1201 10:36:04.828108 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:36:16 crc kubenswrapper[4867]: I1201 10:36:16.826714 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:36:16 crc kubenswrapper[4867]: E1201 10:36:16.827540 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:36:23 crc kubenswrapper[4867]: I1201 10:36:23.034296 4867 scope.go:117] "RemoveContainer" containerID="bbeb71490ef30a7f3298901d003dca10d62fd1e19ef7d09c8f154e82c2fa643f" Dec 01 10:36:30 crc kubenswrapper[4867]: I1201 10:36:30.828056 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb" Dec 01 10:36:31 crc kubenswrapper[4867]: I1201 10:36:31.488155 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"7eade69509a6bd7bbb2082e52b0943889c75068ffe4ca36fc1f8c12b2a2b8834"} Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.556025 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f7gn7"] Dec 01 10:36:52 crc kubenswrapper[4867]: E1201 10:36:52.565573 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerName="extract-utilities" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.565596 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerName="extract-utilities" Dec 01 10:36:52 crc kubenswrapper[4867]: E1201 10:36:52.565640 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerName="registry-server" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.565650 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerName="registry-server" Dec 01 10:36:52 crc kubenswrapper[4867]: E1201 10:36:52.565667 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerName="extract-content" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.565675 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerName="extract-content" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.565959 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b8930f-d497-4791-8c9f-0da2fbcf49e1" containerName="registry-server" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.568645 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-f7gn7"] Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.568778 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.762403 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-utilities\") pod \"certified-operators-f7gn7\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.762495 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-catalog-content\") pod \"certified-operators-f7gn7\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.762536 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d62xk\" (UniqueName: \"kubernetes.io/projected/b309016b-5997-4306-a687-9f8e0342ae48-kube-api-access-d62xk\") pod \"certified-operators-f7gn7\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.863923 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-utilities\") pod \"certified-operators-f7gn7\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.864000 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-catalog-content\") pod \"certified-operators-f7gn7\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.864030 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d62xk\" (UniqueName: \"kubernetes.io/projected/b309016b-5997-4306-a687-9f8e0342ae48-kube-api-access-d62xk\") pod \"certified-operators-f7gn7\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.864444 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-utilities\") pod \"certified-operators-f7gn7\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.864525 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-catalog-content\") pod \"certified-operators-f7gn7\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:52 crc kubenswrapper[4867]: I1201 10:36:52.898898 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d62xk\" (UniqueName: \"kubernetes.io/projected/b309016b-5997-4306-a687-9f8e0342ae48-kube-api-access-d62xk\") pod \"certified-operators-f7gn7\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:53 crc kubenswrapper[4867]: I1201 10:36:53.193910 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:36:53 crc kubenswrapper[4867]: I1201 10:36:53.698652 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7gn7"] Dec 01 10:36:53 crc kubenswrapper[4867]: I1201 10:36:53.729259 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gn7" event={"ID":"b309016b-5997-4306-a687-9f8e0342ae48","Type":"ContainerStarted","Data":"579ee10f48309c67ebad062e8e74bf52c8e117c592371c201a6b405d39c9f116"} Dec 01 10:36:54 crc kubenswrapper[4867]: I1201 10:36:54.740877 4867 generic.go:334] "Generic (PLEG): container finished" podID="b309016b-5997-4306-a687-9f8e0342ae48" containerID="393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31" exitCode=0 Dec 01 10:36:54 crc kubenswrapper[4867]: I1201 10:36:54.740955 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gn7" event={"ID":"b309016b-5997-4306-a687-9f8e0342ae48","Type":"ContainerDied","Data":"393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31"} Dec 01 10:36:56 crc kubenswrapper[4867]: I1201 10:36:56.762204 4867 generic.go:334] "Generic (PLEG): container finished" podID="b309016b-5997-4306-a687-9f8e0342ae48" containerID="fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311" exitCode=0 Dec 01 10:36:56 crc kubenswrapper[4867]: I1201 10:36:56.762287 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gn7" event={"ID":"b309016b-5997-4306-a687-9f8e0342ae48","Type":"ContainerDied","Data":"fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311"} Dec 01 10:36:57 crc kubenswrapper[4867]: I1201 10:36:57.773263 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gn7" 
event={"ID":"b309016b-5997-4306-a687-9f8e0342ae48","Type":"ContainerStarted","Data":"2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5"} Dec 01 10:36:57 crc kubenswrapper[4867]: I1201 10:36:57.792034 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f7gn7" podStartSLOduration=3.331346894 podStartE2EDuration="5.792007466s" podCreationTimestamp="2025-12-01 10:36:52 +0000 UTC" firstStartedPulling="2025-12-01 10:36:54.743476366 +0000 UTC m=+5336.202863130" lastFinishedPulling="2025-12-01 10:36:57.204136948 +0000 UTC m=+5338.663523702" observedRunningTime="2025-12-01 10:36:57.79074238 +0000 UTC m=+5339.250129154" watchObservedRunningTime="2025-12-01 10:36:57.792007466 +0000 UTC m=+5339.251394220" Dec 01 10:37:03 crc kubenswrapper[4867]: I1201 10:37:03.194902 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:37:03 crc kubenswrapper[4867]: I1201 10:37:03.196580 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:37:03 crc kubenswrapper[4867]: I1201 10:37:03.251663 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:37:03 crc kubenswrapper[4867]: I1201 10:37:03.874382 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:37:03 crc kubenswrapper[4867]: I1201 10:37:03.934532 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7gn7"] Dec 01 10:37:05 crc kubenswrapper[4867]: I1201 10:37:05.859107 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f7gn7" podUID="b309016b-5997-4306-a687-9f8e0342ae48" containerName="registry-server" 
containerID="cri-o://2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5" gracePeriod=2 Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.346202 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.454534 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-catalog-content\") pod \"b309016b-5997-4306-a687-9f8e0342ae48\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.454590 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-utilities\") pod \"b309016b-5997-4306-a687-9f8e0342ae48\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.454689 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d62xk\" (UniqueName: \"kubernetes.io/projected/b309016b-5997-4306-a687-9f8e0342ae48-kube-api-access-d62xk\") pod \"b309016b-5997-4306-a687-9f8e0342ae48\" (UID: \"b309016b-5997-4306-a687-9f8e0342ae48\") " Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.455608 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-utilities" (OuterVolumeSpecName: "utilities") pod "b309016b-5997-4306-a687-9f8e0342ae48" (UID: "b309016b-5997-4306-a687-9f8e0342ae48"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.460344 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b309016b-5997-4306-a687-9f8e0342ae48-kube-api-access-d62xk" (OuterVolumeSpecName: "kube-api-access-d62xk") pod "b309016b-5997-4306-a687-9f8e0342ae48" (UID: "b309016b-5997-4306-a687-9f8e0342ae48"). InnerVolumeSpecName "kube-api-access-d62xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.503698 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b309016b-5997-4306-a687-9f8e0342ae48" (UID: "b309016b-5997-4306-a687-9f8e0342ae48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.557782 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d62xk\" (UniqueName: \"kubernetes.io/projected/b309016b-5997-4306-a687-9f8e0342ae48-kube-api-access-d62xk\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.558079 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.558210 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b309016b-5997-4306-a687-9f8e0342ae48-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.871145 4867 generic.go:334] "Generic (PLEG): container finished" podID="b309016b-5997-4306-a687-9f8e0342ae48" 
containerID="2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5" exitCode=0 Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.871205 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gn7" event={"ID":"b309016b-5997-4306-a687-9f8e0342ae48","Type":"ContainerDied","Data":"2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5"} Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.871224 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7gn7" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.871244 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gn7" event={"ID":"b309016b-5997-4306-a687-9f8e0342ae48","Type":"ContainerDied","Data":"579ee10f48309c67ebad062e8e74bf52c8e117c592371c201a6b405d39c9f116"} Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.871894 4867 scope.go:117] "RemoveContainer" containerID="2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.903962 4867 scope.go:117] "RemoveContainer" containerID="fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.909644 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7gn7"] Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.926773 4867 scope.go:117] "RemoveContainer" containerID="393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.931675 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f7gn7"] Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.975744 4867 scope.go:117] "RemoveContainer" containerID="2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5" Dec 01 
10:37:06 crc kubenswrapper[4867]: E1201 10:37:06.976358 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5\": container with ID starting with 2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5 not found: ID does not exist" containerID="2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.976400 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5"} err="failed to get container status \"2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5\": rpc error: code = NotFound desc = could not find container \"2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5\": container with ID starting with 2ac286d34d4ddb3a4a43b57bc4846ac9571422d78bc5091d457fb475351334e5 not found: ID does not exist" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.976432 4867 scope.go:117] "RemoveContainer" containerID="fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311" Dec 01 10:37:06 crc kubenswrapper[4867]: E1201 10:37:06.976841 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311\": container with ID starting with fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311 not found: ID does not exist" containerID="fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.976864 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311"} err="failed to get container status 
\"fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311\": rpc error: code = NotFound desc = could not find container \"fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311\": container with ID starting with fe434abbe91f818df3d112177502c2f9af211e231b593d40dcbcbb5c6b2ba311 not found: ID does not exist" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.976881 4867 scope.go:117] "RemoveContainer" containerID="393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31" Dec 01 10:37:06 crc kubenswrapper[4867]: E1201 10:37:06.977149 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31\": container with ID starting with 393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31 not found: ID does not exist" containerID="393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31" Dec 01 10:37:06 crc kubenswrapper[4867]: I1201 10:37:06.977207 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31"} err="failed to get container status \"393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31\": rpc error: code = NotFound desc = could not find container \"393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31\": container with ID starting with 393841ab1c41d150887561b729bd31677dd0f2e38746d7aeacfe2cb4c6af7f31 not found: ID does not exist" Dec 01 10:37:08 crc kubenswrapper[4867]: I1201 10:37:08.836563 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b309016b-5997-4306-a687-9f8e0342ae48" path="/var/lib/kubelet/pods/b309016b-5997-4306-a687-9f8e0342ae48/volumes" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.143896 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t8fj7"] Dec 01 10:37:31 crc 
kubenswrapper[4867]: E1201 10:37:31.144804 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b309016b-5997-4306-a687-9f8e0342ae48" containerName="registry-server" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.144836 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b309016b-5997-4306-a687-9f8e0342ae48" containerName="registry-server" Dec 01 10:37:31 crc kubenswrapper[4867]: E1201 10:37:31.144853 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b309016b-5997-4306-a687-9f8e0342ae48" containerName="extract-utilities" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.144859 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b309016b-5997-4306-a687-9f8e0342ae48" containerName="extract-utilities" Dec 01 10:37:31 crc kubenswrapper[4867]: E1201 10:37:31.144872 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b309016b-5997-4306-a687-9f8e0342ae48" containerName="extract-content" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.144878 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b309016b-5997-4306-a687-9f8e0342ae48" containerName="extract-content" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.145041 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b309016b-5997-4306-a687-9f8e0342ae48" containerName="registry-server" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.146461 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.164758 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t8fj7"] Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.295475 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8btw\" (UniqueName: \"kubernetes.io/projected/7275a285-256c-48dd-b0d6-b80fc37603b6-kube-api-access-z8btw\") pod \"redhat-operators-t8fj7\" (UID: \"7275a285-256c-48dd-b0d6-b80fc37603b6\") " pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.295546 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7275a285-256c-48dd-b0d6-b80fc37603b6-utilities\") pod \"redhat-operators-t8fj7\" (UID: \"7275a285-256c-48dd-b0d6-b80fc37603b6\") " pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.295914 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7275a285-256c-48dd-b0d6-b80fc37603b6-catalog-content\") pod \"redhat-operators-t8fj7\" (UID: \"7275a285-256c-48dd-b0d6-b80fc37603b6\") " pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.397834 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7275a285-256c-48dd-b0d6-b80fc37603b6-catalog-content\") pod \"redhat-operators-t8fj7\" (UID: \"7275a285-256c-48dd-b0d6-b80fc37603b6\") " pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.397975 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-z8btw\" (UniqueName: \"kubernetes.io/projected/7275a285-256c-48dd-b0d6-b80fc37603b6-kube-api-access-z8btw\") pod \"redhat-operators-t8fj7\" (UID: \"7275a285-256c-48dd-b0d6-b80fc37603b6\") " pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.398036 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7275a285-256c-48dd-b0d6-b80fc37603b6-utilities\") pod \"redhat-operators-t8fj7\" (UID: \"7275a285-256c-48dd-b0d6-b80fc37603b6\") " pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.398562 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7275a285-256c-48dd-b0d6-b80fc37603b6-utilities\") pod \"redhat-operators-t8fj7\" (UID: \"7275a285-256c-48dd-b0d6-b80fc37603b6\") " pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.398591 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7275a285-256c-48dd-b0d6-b80fc37603b6-catalog-content\") pod \"redhat-operators-t8fj7\" (UID: \"7275a285-256c-48dd-b0d6-b80fc37603b6\") " pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.417200 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8btw\" (UniqueName: \"kubernetes.io/projected/7275a285-256c-48dd-b0d6-b80fc37603b6-kube-api-access-z8btw\") pod \"redhat-operators-t8fj7\" (UID: \"7275a285-256c-48dd-b0d6-b80fc37603b6\") " pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.501660 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:31 crc kubenswrapper[4867]: I1201 10:37:31.965302 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t8fj7"] Dec 01 10:37:32 crc kubenswrapper[4867]: I1201 10:37:32.124954 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8fj7" event={"ID":"7275a285-256c-48dd-b0d6-b80fc37603b6","Type":"ContainerStarted","Data":"e3fbb86a8bd94c6a9a938ecdc5d298e0b8ab049228824013a1db7eaa73cd8393"} Dec 01 10:37:33 crc kubenswrapper[4867]: I1201 10:37:33.138351 4867 generic.go:334] "Generic (PLEG): container finished" podID="7275a285-256c-48dd-b0d6-b80fc37603b6" containerID="45e07ee34fcb9ce9a883d0aad4c9cb1ee3b9be9cdfd4203064efc052cc5b598a" exitCode=0 Dec 01 10:37:33 crc kubenswrapper[4867]: I1201 10:37:33.138415 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8fj7" event={"ID":"7275a285-256c-48dd-b0d6-b80fc37603b6","Type":"ContainerDied","Data":"45e07ee34fcb9ce9a883d0aad4c9cb1ee3b9be9cdfd4203064efc052cc5b598a"} Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.657228 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9h2dq/must-gather-tl5wz"] Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.662111 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.670645 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9h2dq/must-gather-tl5wz"] Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.672166 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9h2dq"/"openshift-service-ca.crt" Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.672168 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9h2dq"/"kube-root-ca.crt" Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.777932 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gbkt\" (UniqueName: \"kubernetes.io/projected/ddabec67-3daf-413f-9c08-fc02e60e9b67-kube-api-access-5gbkt\") pod \"must-gather-tl5wz\" (UID: \"ddabec67-3daf-413f-9c08-fc02e60e9b67\") " pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.778257 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddabec67-3daf-413f-9c08-fc02e60e9b67-must-gather-output\") pod \"must-gather-tl5wz\" (UID: \"ddabec67-3daf-413f-9c08-fc02e60e9b67\") " pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.880210 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gbkt\" (UniqueName: \"kubernetes.io/projected/ddabec67-3daf-413f-9c08-fc02e60e9b67-kube-api-access-5gbkt\") pod \"must-gather-tl5wz\" (UID: \"ddabec67-3daf-413f-9c08-fc02e60e9b67\") " pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.880262 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddabec67-3daf-413f-9c08-fc02e60e9b67-must-gather-output\") pod \"must-gather-tl5wz\" (UID: \"ddabec67-3daf-413f-9c08-fc02e60e9b67\") " pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.880671 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddabec67-3daf-413f-9c08-fc02e60e9b67-must-gather-output\") pod \"must-gather-tl5wz\" (UID: \"ddabec67-3daf-413f-9c08-fc02e60e9b67\") " pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.907555 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gbkt\" (UniqueName: \"kubernetes.io/projected/ddabec67-3daf-413f-9c08-fc02e60e9b67-kube-api-access-5gbkt\") pod \"must-gather-tl5wz\" (UID: \"ddabec67-3daf-413f-9c08-fc02e60e9b67\") " pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:37:34 crc kubenswrapper[4867]: I1201 10:37:34.993179 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:37:35 crc kubenswrapper[4867]: I1201 10:37:35.533576 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9h2dq/must-gather-tl5wz"] Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.192150 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" event={"ID":"ddabec67-3daf-413f-9c08-fc02e60e9b67","Type":"ContainerStarted","Data":"1616903806d2ea3ad7106f47d4102a5445dcf02f2485c1a13cad91c1d7b65740"} Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.192546 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" event={"ID":"ddabec67-3daf-413f-9c08-fc02e60e9b67","Type":"ContainerStarted","Data":"edd849514fcd4d0f7c7eaa3a9de642c2773ef1e6e900405a358f089efbfb1fd4"} Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.192566 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" event={"ID":"ddabec67-3daf-413f-9c08-fc02e60e9b67","Type":"ContainerStarted","Data":"8e4071c4e1c4f9231d83ea0c1a74458eaa5fb5bd47ea894308c44cf4358c980b"} Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.212045 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" podStartSLOduration=2.212028547 podStartE2EDuration="2.212028547s" podCreationTimestamp="2025-12-01 10:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:37:36.207641327 +0000 UTC m=+5377.667028081" watchObservedRunningTime="2025-12-01 10:37:36.212028547 +0000 UTC m=+5377.671415301" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.238523 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8rrx6"] Dec 01 10:37:36 crc 
kubenswrapper[4867]: I1201 10:37:36.242805 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.269578 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rrx6"] Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.318941 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-utilities\") pod \"redhat-marketplace-8rrx6\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.319013 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-catalog-content\") pod \"redhat-marketplace-8rrx6\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.319040 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfbrn\" (UniqueName: \"kubernetes.io/projected/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-kube-api-access-nfbrn\") pod \"redhat-marketplace-8rrx6\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.421261 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-catalog-content\") pod \"redhat-marketplace-8rrx6\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 
crc kubenswrapper[4867]: I1201 10:37:36.421308 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfbrn\" (UniqueName: \"kubernetes.io/projected/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-kube-api-access-nfbrn\") pod \"redhat-marketplace-8rrx6\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.421453 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-utilities\") pod \"redhat-marketplace-8rrx6\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.421984 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-catalog-content\") pod \"redhat-marketplace-8rrx6\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.422046 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-utilities\") pod \"redhat-marketplace-8rrx6\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.449640 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfbrn\" (UniqueName: \"kubernetes.io/projected/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-kube-api-access-nfbrn\") pod \"redhat-marketplace-8rrx6\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:36 crc kubenswrapper[4867]: I1201 10:37:36.579633 
4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:37 crc kubenswrapper[4867]: I1201 10:37:37.172184 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rrx6"] Dec 01 10:37:37 crc kubenswrapper[4867]: I1201 10:37:37.204561 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rrx6" event={"ID":"61a381b2-9bd6-4163-8d9a-a9059d3dedd9","Type":"ContainerStarted","Data":"67e0f0cad848b83cd4c43abe37bdb06541a7dcb7867be7801d554d41c4c11df3"} Dec 01 10:37:38 crc kubenswrapper[4867]: I1201 10:37:38.236987 4867 generic.go:334] "Generic (PLEG): container finished" podID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerID="65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a" exitCode=0 Dec 01 10:37:38 crc kubenswrapper[4867]: I1201 10:37:38.237213 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rrx6" event={"ID":"61a381b2-9bd6-4163-8d9a-a9059d3dedd9","Type":"ContainerDied","Data":"65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a"} Dec 01 10:37:39 crc kubenswrapper[4867]: I1201 10:37:39.255904 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rrx6" event={"ID":"61a381b2-9bd6-4163-8d9a-a9059d3dedd9","Type":"ContainerStarted","Data":"005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b"} Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.266639 4867 generic.go:334] "Generic (PLEG): container finished" podID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerID="005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b" exitCode=0 Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.266897 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rrx6" 
event={"ID":"61a381b2-9bd6-4163-8d9a-a9059d3dedd9","Type":"ContainerDied","Data":"005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b"} Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.306585 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9h2dq/crc-debug-qztw7"] Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.307946 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.312468 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9h2dq"/"default-dockercfg-fb4gp" Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.398951 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d664552c-f492-4231-b7f8-a58ccf57c6ee-host\") pod \"crc-debug-qztw7\" (UID: \"d664552c-f492-4231-b7f8-a58ccf57c6ee\") " pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.398998 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gcrz\" (UniqueName: \"kubernetes.io/projected/d664552c-f492-4231-b7f8-a58ccf57c6ee-kube-api-access-6gcrz\") pod \"crc-debug-qztw7\" (UID: \"d664552c-f492-4231-b7f8-a58ccf57c6ee\") " pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.500467 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d664552c-f492-4231-b7f8-a58ccf57c6ee-host\") pod \"crc-debug-qztw7\" (UID: \"d664552c-f492-4231-b7f8-a58ccf57c6ee\") " pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.500532 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6gcrz\" (UniqueName: \"kubernetes.io/projected/d664552c-f492-4231-b7f8-a58ccf57c6ee-kube-api-access-6gcrz\") pod \"crc-debug-qztw7\" (UID: \"d664552c-f492-4231-b7f8-a58ccf57c6ee\") " pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.500658 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d664552c-f492-4231-b7f8-a58ccf57c6ee-host\") pod \"crc-debug-qztw7\" (UID: \"d664552c-f492-4231-b7f8-a58ccf57c6ee\") " pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.532931 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gcrz\" (UniqueName: \"kubernetes.io/projected/d664552c-f492-4231-b7f8-a58ccf57c6ee-kube-api-access-6gcrz\") pod \"crc-debug-qztw7\" (UID: \"d664552c-f492-4231-b7f8-a58ccf57c6ee\") " pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:37:40 crc kubenswrapper[4867]: I1201 10:37:40.628722 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:37:46 crc kubenswrapper[4867]: I1201 10:37:46.351932 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rrx6" event={"ID":"61a381b2-9bd6-4163-8d9a-a9059d3dedd9","Type":"ContainerStarted","Data":"ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc"} Dec 01 10:37:46 crc kubenswrapper[4867]: I1201 10:37:46.354004 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/crc-debug-qztw7" event={"ID":"d664552c-f492-4231-b7f8-a58ccf57c6ee","Type":"ContainerStarted","Data":"fb51a3e0ebb97c1f98d3feea3604e333b9ac6595d207a1bfbb256e5a8f855a7d"} Dec 01 10:37:46 crc kubenswrapper[4867]: I1201 10:37:46.354059 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/crc-debug-qztw7" event={"ID":"d664552c-f492-4231-b7f8-a58ccf57c6ee","Type":"ContainerStarted","Data":"594fc27815306d250c4ee95c4c3e3f15edbb66ffa5ab3d2bff95012ae99e1839"} Dec 01 10:37:46 crc kubenswrapper[4867]: I1201 10:37:46.355331 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8fj7" event={"ID":"7275a285-256c-48dd-b0d6-b80fc37603b6","Type":"ContainerStarted","Data":"3195c2a992f78cff53bb7fa61ee63c57fbff4be4d61f750b2a6f86d8a9cbd285"} Dec 01 10:37:46 crc kubenswrapper[4867]: I1201 10:37:46.373620 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9h2dq/crc-debug-qztw7" podStartSLOduration=6.373599969 podStartE2EDuration="6.373599969s" podCreationTimestamp="2025-12-01 10:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:37:46.371768838 +0000 UTC m=+5387.831155602" watchObservedRunningTime="2025-12-01 10:37:46.373599969 +0000 UTC m=+5387.832986733" Dec 01 10:37:49 crc kubenswrapper[4867]: I1201 10:37:49.382312 
4867 generic.go:334] "Generic (PLEG): container finished" podID="7275a285-256c-48dd-b0d6-b80fc37603b6" containerID="3195c2a992f78cff53bb7fa61ee63c57fbff4be4d61f750b2a6f86d8a9cbd285" exitCode=0 Dec 01 10:37:49 crc kubenswrapper[4867]: I1201 10:37:49.382401 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8fj7" event={"ID":"7275a285-256c-48dd-b0d6-b80fc37603b6","Type":"ContainerDied","Data":"3195c2a992f78cff53bb7fa61ee63c57fbff4be4d61f750b2a6f86d8a9cbd285"} Dec 01 10:37:49 crc kubenswrapper[4867]: I1201 10:37:49.411487 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8rrx6" podStartSLOduration=5.808234244 podStartE2EDuration="13.411472226s" podCreationTimestamp="2025-12-01 10:37:36 +0000 UTC" firstStartedPulling="2025-12-01 10:37:38.239452921 +0000 UTC m=+5379.698839675" lastFinishedPulling="2025-12-01 10:37:45.842690903 +0000 UTC m=+5387.302077657" observedRunningTime="2025-12-01 10:37:47.38836427 +0000 UTC m=+5388.847751014" watchObservedRunningTime="2025-12-01 10:37:49.411472226 +0000 UTC m=+5390.870858980" Dec 01 10:37:51 crc kubenswrapper[4867]: I1201 10:37:51.407775 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8fj7" event={"ID":"7275a285-256c-48dd-b0d6-b80fc37603b6","Type":"ContainerStarted","Data":"70d33aaf5a60f9bf7d24e26de6e51a2059feef23a229ecf460f3b43b190effbd"} Dec 01 10:37:51 crc kubenswrapper[4867]: I1201 10:37:51.432850 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t8fj7" podStartSLOduration=3.20382825 podStartE2EDuration="20.432827254s" podCreationTimestamp="2025-12-01 10:37:31 +0000 UTC" firstStartedPulling="2025-12-01 10:37:33.14242034 +0000 UTC m=+5374.601807114" lastFinishedPulling="2025-12-01 10:37:50.371419364 +0000 UTC m=+5391.830806118" observedRunningTime="2025-12-01 10:37:51.425080381 +0000 UTC 
m=+5392.884467135" watchObservedRunningTime="2025-12-01 10:37:51.432827254 +0000 UTC m=+5392.892214008" Dec 01 10:37:51 crc kubenswrapper[4867]: I1201 10:37:51.502857 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:51 crc kubenswrapper[4867]: I1201 10:37:51.502908 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:37:52 crc kubenswrapper[4867]: I1201 10:37:52.551218 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t8fj7" podUID="7275a285-256c-48dd-b0d6-b80fc37603b6" containerName="registry-server" probeResult="failure" output=< Dec 01 10:37:52 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Dec 01 10:37:52 crc kubenswrapper[4867]: > Dec 01 10:37:56 crc kubenswrapper[4867]: I1201 10:37:56.581372 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:56 crc kubenswrapper[4867]: I1201 10:37:56.581885 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:56 crc kubenswrapper[4867]: I1201 10:37:56.625730 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:57 crc kubenswrapper[4867]: I1201 10:37:57.521606 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:37:57 crc kubenswrapper[4867]: I1201 10:37:57.577159 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rrx6"] Dec 01 10:37:59 crc kubenswrapper[4867]: I1201 10:37:59.491641 4867 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-8rrx6" podUID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerName="registry-server" containerID="cri-o://ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc" gracePeriod=2 Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.029060 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.173488 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-utilities\") pod \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.173709 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-catalog-content\") pod \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.173935 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfbrn\" (UniqueName: \"kubernetes.io/projected/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-kube-api-access-nfbrn\") pod \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\" (UID: \"61a381b2-9bd6-4163-8d9a-a9059d3dedd9\") " Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.174247 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-utilities" (OuterVolumeSpecName: "utilities") pod "61a381b2-9bd6-4163-8d9a-a9059d3dedd9" (UID: "61a381b2-9bd6-4163-8d9a-a9059d3dedd9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.175718 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.192897 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61a381b2-9bd6-4163-8d9a-a9059d3dedd9" (UID: "61a381b2-9bd6-4163-8d9a-a9059d3dedd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.194627 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-kube-api-access-nfbrn" (OuterVolumeSpecName: "kube-api-access-nfbrn") pod "61a381b2-9bd6-4163-8d9a-a9059d3dedd9" (UID: "61a381b2-9bd6-4163-8d9a-a9059d3dedd9"). InnerVolumeSpecName "kube-api-access-nfbrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.277473 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.277519 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfbrn\" (UniqueName: \"kubernetes.io/projected/61a381b2-9bd6-4163-8d9a-a9059d3dedd9-kube-api-access-nfbrn\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.504537 4867 generic.go:334] "Generic (PLEG): container finished" podID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerID="ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc" exitCode=0 Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.504611 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rrx6" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.505803 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rrx6" event={"ID":"61a381b2-9bd6-4163-8d9a-a9059d3dedd9","Type":"ContainerDied","Data":"ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc"} Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.505983 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rrx6" event={"ID":"61a381b2-9bd6-4163-8d9a-a9059d3dedd9","Type":"ContainerDied","Data":"67e0f0cad848b83cd4c43abe37bdb06541a7dcb7867be7801d554d41c4c11df3"} Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.506078 4867 scope.go:117] "RemoveContainer" containerID="ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.543601 4867 scope.go:117] "RemoveContainer" 
containerID="005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.551633 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rrx6"] Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.557847 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rrx6"] Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.574360 4867 scope.go:117] "RemoveContainer" containerID="65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.632951 4867 scope.go:117] "RemoveContainer" containerID="ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc" Dec 01 10:38:00 crc kubenswrapper[4867]: E1201 10:38:00.633409 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc\": container with ID starting with ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc not found: ID does not exist" containerID="ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.633442 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc"} err="failed to get container status \"ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc\": rpc error: code = NotFound desc = could not find container \"ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc\": container with ID starting with ee61caf4fe403bd836f1bc5977e9c36e92f228a0feb43224e4baf4fbeda835dc not found: ID does not exist" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.633465 4867 scope.go:117] "RemoveContainer" 
containerID="005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b" Dec 01 10:38:00 crc kubenswrapper[4867]: E1201 10:38:00.634101 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b\": container with ID starting with 005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b not found: ID does not exist" containerID="005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.634145 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b"} err="failed to get container status \"005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b\": rpc error: code = NotFound desc = could not find container \"005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b\": container with ID starting with 005baae2ae9814cbe8cd38bc82ed9a4acb98855a4d71da79e11cf98c54aa609b not found: ID does not exist" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.634173 4867 scope.go:117] "RemoveContainer" containerID="65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a" Dec 01 10:38:00 crc kubenswrapper[4867]: E1201 10:38:00.634565 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a\": container with ID starting with 65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a not found: ID does not exist" containerID="65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.634693 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a"} err="failed to get container status \"65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a\": rpc error: code = NotFound desc = could not find container \"65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a\": container with ID starting with 65e19cc2c5a47a351757e31c3a27b4a9fd03079581cd41bfbaae45d8de15392a not found: ID does not exist" Dec 01 10:38:00 crc kubenswrapper[4867]: I1201 10:38:00.838453 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" path="/var/lib/kubelet/pods/61a381b2-9bd6-4163-8d9a-a9059d3dedd9/volumes" Dec 01 10:38:01 crc kubenswrapper[4867]: I1201 10:38:01.549197 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:38:01 crc kubenswrapper[4867]: I1201 10:38:01.602930 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t8fj7" Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.094208 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t8fj7"] Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.271504 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-54smd"] Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.271734 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-54smd" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerName="registry-server" containerID="cri-o://62b683845159d532a0d256c67c241bd4126bfc6e3ea5aad26d618334dde3f684" gracePeriod=2 Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.524896 4867 generic.go:334] "Generic (PLEG): container finished" podID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" 
containerID="62b683845159d532a0d256c67c241bd4126bfc6e3ea5aad26d618334dde3f684" exitCode=0 Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.525879 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54smd" event={"ID":"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6","Type":"ContainerDied","Data":"62b683845159d532a0d256c67c241bd4126bfc6e3ea5aad26d618334dde3f684"} Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.885058 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.967752 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-catalog-content\") pod \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.967895 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-utilities\") pod \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.967933 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctn6d\" (UniqueName: \"kubernetes.io/projected/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-kube-api-access-ctn6d\") pod \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\" (UID: \"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6\") " Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.968761 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-utilities" (OuterVolumeSpecName: "utilities") pod "a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" (UID: 
"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:38:02 crc kubenswrapper[4867]: I1201 10:38:02.995480 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-kube-api-access-ctn6d" (OuterVolumeSpecName: "kube-api-access-ctn6d") pod "a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" (UID: "a7c3f527-e2e1-4a92-b2dc-76cb294f84a6"). InnerVolumeSpecName "kube-api-access-ctn6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.070454 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.070485 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctn6d\" (UniqueName: \"kubernetes.io/projected/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-kube-api-access-ctn6d\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.107797 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" (UID: "a7c3f527-e2e1-4a92-b2dc-76cb294f84a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.197092 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.535160 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54smd" event={"ID":"a7c3f527-e2e1-4a92-b2dc-76cb294f84a6","Type":"ContainerDied","Data":"552aafbc1e11d24c7f7a64296c2adb3c6e6477bb27010ea96992c890bafd94ab"} Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.535198 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54smd" Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.535223 4867 scope.go:117] "RemoveContainer" containerID="62b683845159d532a0d256c67c241bd4126bfc6e3ea5aad26d618334dde3f684" Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.561561 4867 scope.go:117] "RemoveContainer" containerID="98956a490afdca9304dc98ee31dad6bcf1ece77a58ab61eb7b36bc0be316f37b" Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.573656 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-54smd"] Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.582908 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-54smd"] Dec 01 10:38:03 crc kubenswrapper[4867]: I1201 10:38:03.604141 4867 scope.go:117] "RemoveContainer" containerID="2384f17427db5fb292830cfd1e00e30d3dd9916a8453c25be40d4172bec84758" Dec 01 10:38:04 crc kubenswrapper[4867]: I1201 10:38:04.838014 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" path="/var/lib/kubelet/pods/a7c3f527-e2e1-4a92-b2dc-76cb294f84a6/volumes" Dec 01 10:38:34 crc 
kubenswrapper[4867]: I1201 10:38:34.839458 4867 generic.go:334] "Generic (PLEG): container finished" podID="d664552c-f492-4231-b7f8-a58ccf57c6ee" containerID="fb51a3e0ebb97c1f98d3feea3604e333b9ac6595d207a1bfbb256e5a8f855a7d" exitCode=0 Dec 01 10:38:34 crc kubenswrapper[4867]: I1201 10:38:34.839506 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/crc-debug-qztw7" event={"ID":"d664552c-f492-4231-b7f8-a58ccf57c6ee","Type":"ContainerDied","Data":"fb51a3e0ebb97c1f98d3feea3604e333b9ac6595d207a1bfbb256e5a8f855a7d"} Dec 01 10:38:35 crc kubenswrapper[4867]: I1201 10:38:35.967442 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.004151 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9h2dq/crc-debug-qztw7"] Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.013610 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9h2dq/crc-debug-qztw7"] Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.063177 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d664552c-f492-4231-b7f8-a58ccf57c6ee-host\") pod \"d664552c-f492-4231-b7f8-a58ccf57c6ee\" (UID: \"d664552c-f492-4231-b7f8-a58ccf57c6ee\") " Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.063307 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gcrz\" (UniqueName: \"kubernetes.io/projected/d664552c-f492-4231-b7f8-a58ccf57c6ee-kube-api-access-6gcrz\") pod \"d664552c-f492-4231-b7f8-a58ccf57c6ee\" (UID: \"d664552c-f492-4231-b7f8-a58ccf57c6ee\") " Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.064228 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664552c-f492-4231-b7f8-a58ccf57c6ee-host" 
(OuterVolumeSpecName: "host") pod "d664552c-f492-4231-b7f8-a58ccf57c6ee" (UID: "d664552c-f492-4231-b7f8-a58ccf57c6ee"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.069325 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d664552c-f492-4231-b7f8-a58ccf57c6ee-kube-api-access-6gcrz" (OuterVolumeSpecName: "kube-api-access-6gcrz") pod "d664552c-f492-4231-b7f8-a58ccf57c6ee" (UID: "d664552c-f492-4231-b7f8-a58ccf57c6ee"). InnerVolumeSpecName "kube-api-access-6gcrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.165680 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d664552c-f492-4231-b7f8-a58ccf57c6ee-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.165716 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gcrz\" (UniqueName: \"kubernetes.io/projected/d664552c-f492-4231-b7f8-a58ccf57c6ee-kube-api-access-6gcrz\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.840619 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d664552c-f492-4231-b7f8-a58ccf57c6ee" path="/var/lib/kubelet/pods/d664552c-f492-4231-b7f8-a58ccf57c6ee/volumes" Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.859312 4867 scope.go:117] "RemoveContainer" containerID="fb51a3e0ebb97c1f98d3feea3604e333b9ac6595d207a1bfbb256e5a8f855a7d" Dec 01 10:38:36 crc kubenswrapper[4867]: I1201 10:38:36.859480 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-qztw7" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.195507 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9h2dq/crc-debug-gwvrn"] Dec 01 10:38:37 crc kubenswrapper[4867]: E1201 10:38:37.196122 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerName="extract-content" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196135 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerName="extract-content" Dec 01 10:38:37 crc kubenswrapper[4867]: E1201 10:38:37.196152 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664552c-f492-4231-b7f8-a58ccf57c6ee" containerName="container-00" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196158 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664552c-f492-4231-b7f8-a58ccf57c6ee" containerName="container-00" Dec 01 10:38:37 crc kubenswrapper[4867]: E1201 10:38:37.196183 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerName="registry-server" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196190 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerName="registry-server" Dec 01 10:38:37 crc kubenswrapper[4867]: E1201 10:38:37.196200 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerName="extract-content" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196206 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerName="extract-content" Dec 01 10:38:37 crc kubenswrapper[4867]: E1201 10:38:37.196212 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" 
containerName="registry-server" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196218 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerName="registry-server" Dec 01 10:38:37 crc kubenswrapper[4867]: E1201 10:38:37.196228 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerName="extract-utilities" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196236 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerName="extract-utilities" Dec 01 10:38:37 crc kubenswrapper[4867]: E1201 10:38:37.196250 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerName="extract-utilities" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196255 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerName="extract-utilities" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196418 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c3f527-e2e1-4a92-b2dc-76cb294f84a6" containerName="registry-server" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196427 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a381b2-9bd6-4163-8d9a-a9059d3dedd9" containerName="registry-server" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.196438 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664552c-f492-4231-b7f8-a58ccf57c6ee" containerName="container-00" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.197083 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.201738 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9h2dq"/"default-dockercfg-fb4gp" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.289419 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hnpz\" (UniqueName: \"kubernetes.io/projected/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-kube-api-access-9hnpz\") pod \"crc-debug-gwvrn\" (UID: \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\") " pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.289500 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-host\") pod \"crc-debug-gwvrn\" (UID: \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\") " pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.391346 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-host\") pod \"crc-debug-gwvrn\" (UID: \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\") " pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.391525 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-host\") pod \"crc-debug-gwvrn\" (UID: \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\") " pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.391805 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hnpz\" (UniqueName: 
\"kubernetes.io/projected/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-kube-api-access-9hnpz\") pod \"crc-debug-gwvrn\" (UID: \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\") " pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.411778 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hnpz\" (UniqueName: \"kubernetes.io/projected/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-kube-api-access-9hnpz\") pod \"crc-debug-gwvrn\" (UID: \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\") " pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.520948 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.884440 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" event={"ID":"3bd96880-bdff-4f8e-b392-6fe1a5ca2915","Type":"ContainerStarted","Data":"8561f0a7a281413dc97a8ee0e6ed54aba8fb6029cf854f1b2a1d9cef16b2c516"} Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.884710 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" event={"ID":"3bd96880-bdff-4f8e-b392-6fe1a5ca2915","Type":"ContainerStarted","Data":"2c49f6b2630efcf9962bf6fa6b253f58bc59610bb7c79ea888592ce289686afe"} Dec 01 10:38:37 crc kubenswrapper[4867]: I1201 10:38:37.917153 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" podStartSLOduration=0.917132777 podStartE2EDuration="917.132777ms" podCreationTimestamp="2025-12-01 10:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:38:37.904510281 +0000 UTC m=+5439.363897045" watchObservedRunningTime="2025-12-01 10:38:37.917132777 +0000 UTC 
m=+5439.376519531" Dec 01 10:38:38 crc kubenswrapper[4867]: I1201 10:38:38.899082 4867 generic.go:334] "Generic (PLEG): container finished" podID="3bd96880-bdff-4f8e-b392-6fe1a5ca2915" containerID="8561f0a7a281413dc97a8ee0e6ed54aba8fb6029cf854f1b2a1d9cef16b2c516" exitCode=0 Dec 01 10:38:38 crc kubenswrapper[4867]: I1201 10:38:38.899126 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" event={"ID":"3bd96880-bdff-4f8e-b392-6fe1a5ca2915","Type":"ContainerDied","Data":"8561f0a7a281413dc97a8ee0e6ed54aba8fb6029cf854f1b2a1d9cef16b2c516"} Dec 01 10:38:39 crc kubenswrapper[4867]: I1201 10:38:39.998891 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.079328 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9h2dq/crc-debug-gwvrn"] Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.091979 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9h2dq/crc-debug-gwvrn"] Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.142053 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hnpz\" (UniqueName: \"kubernetes.io/projected/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-kube-api-access-9hnpz\") pod \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\" (UID: \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\") " Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.142232 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-host\") pod \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\" (UID: \"3bd96880-bdff-4f8e-b392-6fe1a5ca2915\") " Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.142335 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-host" (OuterVolumeSpecName: "host") pod "3bd96880-bdff-4f8e-b392-6fe1a5ca2915" (UID: "3bd96880-bdff-4f8e-b392-6fe1a5ca2915"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.142898 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.152439 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-kube-api-access-9hnpz" (OuterVolumeSpecName: "kube-api-access-9hnpz") pod "3bd96880-bdff-4f8e-b392-6fe1a5ca2915" (UID: "3bd96880-bdff-4f8e-b392-6fe1a5ca2915"). InnerVolumeSpecName "kube-api-access-9hnpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.244638 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hnpz\" (UniqueName: \"kubernetes.io/projected/3bd96880-bdff-4f8e-b392-6fe1a5ca2915-kube-api-access-9hnpz\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.839553 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd96880-bdff-4f8e-b392-6fe1a5ca2915" path="/var/lib/kubelet/pods/3bd96880-bdff-4f8e-b392-6fe1a5ca2915/volumes" Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.918993 4867 scope.go:117] "RemoveContainer" containerID="8561f0a7a281413dc97a8ee0e6ed54aba8fb6029cf854f1b2a1d9cef16b2c516" Dec 01 10:38:40 crc kubenswrapper[4867]: I1201 10:38:40.919067 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-gwvrn" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.272493 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9h2dq/crc-debug-qjk72"] Dec 01 10:38:41 crc kubenswrapper[4867]: E1201 10:38:41.274008 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd96880-bdff-4f8e-b392-6fe1a5ca2915" containerName="container-00" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.274116 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd96880-bdff-4f8e-b392-6fe1a5ca2915" containerName="container-00" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.274403 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd96880-bdff-4f8e-b392-6fe1a5ca2915" containerName="container-00" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.275297 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-qjk72" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.280030 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9h2dq"/"default-dockercfg-fb4gp" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.368071 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsgk7\" (UniqueName: \"kubernetes.io/projected/fc1e5dca-fb58-4128-8721-4bb5e86ce405-kube-api-access-rsgk7\") pod \"crc-debug-qjk72\" (UID: \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\") " pod="openshift-must-gather-9h2dq/crc-debug-qjk72" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.368351 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc1e5dca-fb58-4128-8721-4bb5e86ce405-host\") pod \"crc-debug-qjk72\" (UID: \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\") " 
pod="openshift-must-gather-9h2dq/crc-debug-qjk72" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.471367 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsgk7\" (UniqueName: \"kubernetes.io/projected/fc1e5dca-fb58-4128-8721-4bb5e86ce405-kube-api-access-rsgk7\") pod \"crc-debug-qjk72\" (UID: \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\") " pod="openshift-must-gather-9h2dq/crc-debug-qjk72" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.471563 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc1e5dca-fb58-4128-8721-4bb5e86ce405-host\") pod \"crc-debug-qjk72\" (UID: \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\") " pod="openshift-must-gather-9h2dq/crc-debug-qjk72" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.471777 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc1e5dca-fb58-4128-8721-4bb5e86ce405-host\") pod \"crc-debug-qjk72\" (UID: \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\") " pod="openshift-must-gather-9h2dq/crc-debug-qjk72" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.508684 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsgk7\" (UniqueName: \"kubernetes.io/projected/fc1e5dca-fb58-4128-8721-4bb5e86ce405-kube-api-access-rsgk7\") pod \"crc-debug-qjk72\" (UID: \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\") " pod="openshift-must-gather-9h2dq/crc-debug-qjk72" Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.598685 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-qjk72" Dec 01 10:38:41 crc kubenswrapper[4867]: W1201 10:38:41.639393 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc1e5dca_fb58_4128_8721_4bb5e86ce405.slice/crio-ace02e75597235640de5dc4d6775f9f6380572374a5396c74e2b3edfbe78b427 WatchSource:0}: Error finding container ace02e75597235640de5dc4d6775f9f6380572374a5396c74e2b3edfbe78b427: Status 404 returned error can't find the container with id ace02e75597235640de5dc4d6775f9f6380572374a5396c74e2b3edfbe78b427 Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.938060 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/crc-debug-qjk72" event={"ID":"fc1e5dca-fb58-4128-8721-4bb5e86ce405","Type":"ContainerStarted","Data":"8e57cd2d0e27b58f54ea580a0e56bb7128cc0aea03141a1bd3f2c1380d1db8ac"} Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.938107 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/crc-debug-qjk72" event={"ID":"fc1e5dca-fb58-4128-8721-4bb5e86ce405","Type":"ContainerStarted","Data":"ace02e75597235640de5dc4d6775f9f6380572374a5396c74e2b3edfbe78b427"} Dec 01 10:38:41 crc kubenswrapper[4867]: I1201 10:38:41.991142 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9h2dq/crc-debug-qjk72"] Dec 01 10:38:42 crc kubenswrapper[4867]: I1201 10:38:42.005668 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9h2dq/crc-debug-qjk72"] Dec 01 10:38:42 crc kubenswrapper[4867]: I1201 10:38:42.959236 4867 generic.go:334] "Generic (PLEG): container finished" podID="fc1e5dca-fb58-4128-8721-4bb5e86ce405" containerID="8e57cd2d0e27b58f54ea580a0e56bb7128cc0aea03141a1bd3f2c1380d1db8ac" exitCode=0 Dec 01 10:38:43 crc kubenswrapper[4867]: I1201 10:38:43.063477 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-qjk72"
Dec 01 10:38:43 crc kubenswrapper[4867]: I1201 10:38:43.172415 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc1e5dca-fb58-4128-8721-4bb5e86ce405-host\") pod \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\" (UID: \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\") "
Dec 01 10:38:43 crc kubenswrapper[4867]: I1201 10:38:43.172562 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc1e5dca-fb58-4128-8721-4bb5e86ce405-host" (OuterVolumeSpecName: "host") pod "fc1e5dca-fb58-4128-8721-4bb5e86ce405" (UID: "fc1e5dca-fb58-4128-8721-4bb5e86ce405"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 10:38:43 crc kubenswrapper[4867]: I1201 10:38:43.172721 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsgk7\" (UniqueName: \"kubernetes.io/projected/fc1e5dca-fb58-4128-8721-4bb5e86ce405-kube-api-access-rsgk7\") pod \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\" (UID: \"fc1e5dca-fb58-4128-8721-4bb5e86ce405\") "
Dec 01 10:38:43 crc kubenswrapper[4867]: I1201 10:38:43.173777 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc1e5dca-fb58-4128-8721-4bb5e86ce405-host\") on node \"crc\" DevicePath \"\""
Dec 01 10:38:43 crc kubenswrapper[4867]: I1201 10:38:43.188519 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1e5dca-fb58-4128-8721-4bb5e86ce405-kube-api-access-rsgk7" (OuterVolumeSpecName: "kube-api-access-rsgk7") pod "fc1e5dca-fb58-4128-8721-4bb5e86ce405" (UID: "fc1e5dca-fb58-4128-8721-4bb5e86ce405"). InnerVolumeSpecName "kube-api-access-rsgk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:38:43 crc kubenswrapper[4867]: I1201 10:38:43.276282 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsgk7\" (UniqueName: \"kubernetes.io/projected/fc1e5dca-fb58-4128-8721-4bb5e86ce405-kube-api-access-rsgk7\") on node \"crc\" DevicePath \"\""
Dec 01 10:38:43 crc kubenswrapper[4867]: I1201 10:38:43.971330 4867 scope.go:117] "RemoveContainer" containerID="8e57cd2d0e27b58f54ea580a0e56bb7128cc0aea03141a1bd3f2c1380d1db8ac"
Dec 01 10:38:43 crc kubenswrapper[4867]: I1201 10:38:43.971406 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9h2dq/crc-debug-qjk72"
Dec 01 10:38:44 crc kubenswrapper[4867]: I1201 10:38:44.840942 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc1e5dca-fb58-4128-8721-4bb5e86ce405" path="/var/lib/kubelet/pods/fc1e5dca-fb58-4128-8721-4bb5e86ce405/volumes"
Dec 01 10:38:51 crc kubenswrapper[4867]: I1201 10:38:51.602184 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 10:38:51 crc kubenswrapper[4867]: I1201 10:38:51.602805 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 10:39:09 crc kubenswrapper[4867]: I1201 10:39:09.646958 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b97bc66cd-p4vv6_f948002f-f1df-40b5-8fcc-db28284c2609/barbican-api/0.log"
Dec 01 10:39:09 crc kubenswrapper[4867]: I1201 10:39:09.808108 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b97bc66cd-p4vv6_f948002f-f1df-40b5-8fcc-db28284c2609/barbican-api-log/0.log"
Dec 01 10:39:09 crc kubenswrapper[4867]: I1201 10:39:09.883907 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f9f6fdc98-l7cht_a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2/barbican-keystone-listener/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.038772 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f9f6fdc98-l7cht_a1d8136b-aa0f-4cbe-b56a-6151d5ab8ce2/barbican-keystone-listener-log/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.067959 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65fbb9cf75-989xz_8469f9a0-94d4-4c2c-839a-80d619a2d984/barbican-worker/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.128186 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65fbb9cf75-989xz_8469f9a0-94d4-4c2c-839a-80d619a2d984/barbican-worker-log/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.330505 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pzxzw_a32a973f-6473-444b-a71a-d848773d8de2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.463514 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2699b818-66ce-4531-9084-e599305630ed/ceilometer-central-agent/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.547773 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2699b818-66ce-4531-9084-e599305630ed/ceilometer-notification-agent/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.558796 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2699b818-66ce-4531-9084-e599305630ed/proxy-httpd/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.655661 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2699b818-66ce-4531-9084-e599305630ed/sg-core/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.824051 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c71e5b77-e090-4fdd-a254-387c5f9c5fba/cinder-api-log/0.log"
Dec 01 10:39:10 crc kubenswrapper[4867]: I1201 10:39:10.900074 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c71e5b77-e090-4fdd-a254-387c5f9c5fba/cinder-api/0.log"
Dec 01 10:39:11 crc kubenswrapper[4867]: I1201 10:39:11.269641 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9fe6c397-9427-4440-9d14-b0397c62f8ea/probe/0.log"
Dec 01 10:39:11 crc kubenswrapper[4867]: I1201 10:39:11.283916 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9fe6c397-9427-4440-9d14-b0397c62f8ea/cinder-scheduler/0.log"
Dec 01 10:39:11 crc kubenswrapper[4867]: I1201 10:39:11.404774 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-44ppb_d598d0dc-37e0-47ac-8fcd-597f70f1300b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:11 crc kubenswrapper[4867]: I1201 10:39:11.547796 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-8xnwq_828b404a-aff1-4642-8893-d0ba513e520d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:11 crc kubenswrapper[4867]: I1201 10:39:11.677482 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-94nfw_cfe6379b-a971-4c4b-9cba-75f2f56de0b1/init/0.log"
Dec 01 10:39:11 crc kubenswrapper[4867]: I1201 10:39:11.816728 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-94nfw_cfe6379b-a971-4c4b-9cba-75f2f56de0b1/init/0.log"
Dec 01 10:39:11 crc kubenswrapper[4867]: I1201 10:39:11.931552 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vl4ws_eb0d277f-4c89-46d6-8e05-e72c291e30cc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:12 crc kubenswrapper[4867]: I1201 10:39:12.002293 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-94nfw_cfe6379b-a971-4c4b-9cba-75f2f56de0b1/dnsmasq-dns/0.log"
Dec 01 10:39:12 crc kubenswrapper[4867]: I1201 10:39:12.240103 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1b458da1-78cd-4603-936e-e60b83594fad/glance-log/0.log"
Dec 01 10:39:12 crc kubenswrapper[4867]: I1201 10:39:12.261587 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_1b458da1-78cd-4603-936e-e60b83594fad/glance-httpd/0.log"
Dec 01 10:39:12 crc kubenswrapper[4867]: I1201 10:39:12.371296 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f529607-d9e3-4605-8428-5903a9bab379/glance-httpd/0.log"
Dec 01 10:39:12 crc kubenswrapper[4867]: I1201 10:39:12.431525 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f529607-d9e3-4605-8428-5903a9bab379/glance-log/0.log"
Dec 01 10:39:12 crc kubenswrapper[4867]: I1201 10:39:12.524082 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d47c7cb76-srf4p_8bd4fac2-df2c-4aab-bf00-99b54a83ddca/horizon/2.log"
Dec 01 10:39:12 crc kubenswrapper[4867]: I1201 10:39:12.748935 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d47c7cb76-srf4p_8bd4fac2-df2c-4aab-bf00-99b54a83ddca/horizon/1.log"
Dec 01 10:39:12 crc kubenswrapper[4867]: I1201 10:39:12.848739 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-b8r49_c4e1c416-5403-4334-bf63-019f8546a2ab/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:13 crc kubenswrapper[4867]: I1201 10:39:13.101504 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ktd88_93968ab3-45b8-4b7a-a395-8344714bb9e9/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:13 crc kubenswrapper[4867]: I1201 10:39:13.211407 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d47c7cb76-srf4p_8bd4fac2-df2c-4aab-bf00-99b54a83ddca/horizon-log/0.log"
Dec 01 10:39:13 crc kubenswrapper[4867]: I1201 10:39:13.556384 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29409721-k9xqv_9a2a65a7-bbfb-40ee-bfe2-f99d1173daef/keystone-cron/0.log"
Dec 01 10:39:13 crc kubenswrapper[4867]: I1201 10:39:13.764443 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_3d78f955-151d-46a9-9ef3-183051c318e6/kube-state-metrics/0.log"
Dec 01 10:39:13 crc kubenswrapper[4867]: I1201 10:39:13.910109 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7b56ffdd7f-kp95s_fe5cd024-6dbb-4ecf-8ea8-147e8f8a5ea0/keystone-api/0.log"
Dec 01 10:39:14 crc kubenswrapper[4867]: I1201 10:39:14.032599 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qzjpp_3e0cef16-29f1-49cf-aee1-7c5d9963aa81/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:14 crc kubenswrapper[4867]: I1201 10:39:14.460327 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qgp66_27c456e0-7f00-42b5-b4e7-c5120389d2c1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:14 crc kubenswrapper[4867]: I1201 10:39:14.653321 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59b9c878df-5k6nq_7dea6dbd-f761-4336-b755-0a2c82f6c66b/neutron-httpd/0.log"
Dec 01 10:39:14 crc kubenswrapper[4867]: I1201 10:39:14.914398 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59b9c878df-5k6nq_7dea6dbd-f761-4336-b755-0a2c82f6c66b/neutron-api/0.log"
Dec 01 10:39:15 crc kubenswrapper[4867]: I1201 10:39:15.661227 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_13963b70-5558-4e19-9b73-555d74be129a/nova-cell1-conductor-conductor/0.log"
Dec 01 10:39:15 crc kubenswrapper[4867]: I1201 10:39:15.674477 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3030542c-dee9-40e5-af75-53a0bbc22301/nova-cell0-conductor-conductor/0.log"
Dec 01 10:39:16 crc kubenswrapper[4867]: I1201 10:39:16.127967 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29c7ce91-10c4-45b8-ba1c-db503ab7d5a7/nova-api-log/0.log"
Dec 01 10:39:16 crc kubenswrapper[4867]: I1201 10:39:16.151285 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2a737446-2c4b-44f1-b660-9e433c5eb2d1/memcached/0.log"
Dec 01 10:39:16 crc kubenswrapper[4867]: I1201 10:39:16.156956 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d178f07b-43d0-48ea-a5fe-898f68e80850/nova-cell1-novncproxy-novncproxy/0.log"
Dec 01 10:39:16 crc kubenswrapper[4867]: I1201 10:39:16.429276 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29c7ce91-10c4-45b8-ba1c-db503ab7d5a7/nova-api-api/0.log"
Dec 01 10:39:16 crc kubenswrapper[4867]: I1201 10:39:16.441679 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-lmcp6_25628db2-c71e-4e6e-bfa2-d753bfc7fb89/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:16 crc kubenswrapper[4867]: I1201 10:39:16.576730 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2f8e90f5-24d7-406e-a2aa-d44b9e6bac71/nova-metadata-log/0.log"
Dec 01 10:39:16 crc kubenswrapper[4867]: I1201 10:39:16.941224 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31106653-bdaa-49c3-b14c-8eb180b0b2c3/mysql-bootstrap/0.log"
Dec 01 10:39:17 crc kubenswrapper[4867]: I1201 10:39:17.069150 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31106653-bdaa-49c3-b14c-8eb180b0b2c3/mysql-bootstrap/0.log"
Dec 01 10:39:17 crc kubenswrapper[4867]: I1201 10:39:17.161170 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31106653-bdaa-49c3-b14c-8eb180b0b2c3/galera/0.log"
Dec 01 10:39:17 crc kubenswrapper[4867]: I1201 10:39:17.223561 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d0ac7269-f887-4d9f-a582-6726a5be70f7/nova-scheduler-scheduler/0.log"
Dec 01 10:39:17 crc kubenswrapper[4867]: I1201 10:39:17.349512 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7a36be7a-7b6d-443d-94c6-4b3bdff15ec8/mysql-bootstrap/0.log"
Dec 01 10:39:17 crc kubenswrapper[4867]: I1201 10:39:17.622044 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7a36be7a-7b6d-443d-94c6-4b3bdff15ec8/galera/0.log"
Dec 01 10:39:17 crc kubenswrapper[4867]: I1201 10:39:17.622983 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7a36be7a-7b6d-443d-94c6-4b3bdff15ec8/mysql-bootstrap/0.log"
Dec 01 10:39:17 crc kubenswrapper[4867]: I1201 10:39:17.703771 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_e2fbdcd3-0c11-4681-99af-c9b4fb717637/openstackclient/0.log"
Dec 01 10:39:17 crc kubenswrapper[4867]: I1201 10:39:17.870388 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pfvw2_feff7c40-c771-4824-b3f0-75c4d527044a/openstack-network-exporter/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.041116 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2f8e90f5-24d7-406e-a2aa-d44b9e6bac71/nova-metadata-metadata/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.076248 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9jsgc_91f709b6-1fa8-40fb-80a0-45ea9510b009/ovsdb-server-init/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.228266 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9jsgc_91f709b6-1fa8-40fb-80a0-45ea9510b009/ovsdb-server-init/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.291087 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9jsgc_91f709b6-1fa8-40fb-80a0-45ea9510b009/ovs-vswitchd/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.318453 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9jsgc_91f709b6-1fa8-40fb-80a0-45ea9510b009/ovsdb-server/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.408762 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vmh2x_aa810b5f-4cad-40cc-9feb-6afc38b56ab1/ovn-controller/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.572791 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wpt2t_5f9d5cec-8d85-4f56-b876-06c32bb0a3e7/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.630382 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_826ca141-06c3-45c3-9d5a-e99985971b80/ovn-northd/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.654447 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_826ca141-06c3-45c3-9d5a-e99985971b80/openstack-network-exporter/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.824934 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b1081877-3550-4ad4-9a89-a5cddfc4ba31/openstack-network-exporter/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.848595 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b1081877-3550-4ad4-9a89-a5cddfc4ba31/ovsdbserver-nb/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.898899 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df/openstack-network-exporter/0.log"
Dec 01 10:39:18 crc kubenswrapper[4867]: I1201 10:39:18.986780 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e7d37cf-5ebc-46ca-9c1a-c271f5e4d1df/ovsdbserver-sb/0.log"
Dec 01 10:39:19 crc kubenswrapper[4867]: I1201 10:39:19.183822 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-68bfcdf768-4dtj7_a57f081c-e4b7-4dbb-a817-4d36052f3145/placement-api/0.log"
Dec 01 10:39:19 crc kubenswrapper[4867]: I1201 10:39:19.241969 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1a327b42-8b19-491b-a9ba-2c11f0227183/setup-container/0.log"
Dec 01 10:39:19 crc kubenswrapper[4867]: I1201 10:39:19.318078 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-68bfcdf768-4dtj7_a57f081c-e4b7-4dbb-a817-4d36052f3145/placement-log/0.log"
Dec 01 10:39:19 crc kubenswrapper[4867]: I1201 10:39:19.785762 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1a327b42-8b19-491b-a9ba-2c11f0227183/setup-container/0.log"
Dec 01 10:39:19 crc kubenswrapper[4867]: I1201 10:39:19.795097 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1a327b42-8b19-491b-a9ba-2c11f0227183/rabbitmq/0.log"
Dec 01 10:39:19 crc kubenswrapper[4867]: I1201 10:39:19.894555 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e3936faf-3dae-4db5-8851-10c1ebe7673b/setup-container/0.log"
Dec 01 10:39:20 crc kubenswrapper[4867]: I1201 10:39:20.138871 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e3936faf-3dae-4db5-8851-10c1ebe7673b/setup-container/0.log"
Dec 01 10:39:20 crc kubenswrapper[4867]: I1201 10:39:20.155196 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e3936faf-3dae-4db5-8851-10c1ebe7673b/rabbitmq/0.log"
Dec 01 10:39:20 crc kubenswrapper[4867]: I1201 10:39:20.307988 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xnl4k_4ae71b24-0c2d-46fa-a8e4-3fb8261f6817/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:20 crc kubenswrapper[4867]: I1201 10:39:20.458969 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xllrm_d8a14454-7ca8-4a2d-8626-5234d29dd688/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:20 crc kubenswrapper[4867]: I1201 10:39:20.473620 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pk2lp_1560ec78-ac43-47a7-ab73-69a7decf4ed8/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:20 crc kubenswrapper[4867]: I1201 10:39:20.645208 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gp785_612c2304-16fe-4932-824d-6116da3a4fb8/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:20 crc kubenswrapper[4867]: I1201 10:39:20.685799 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2glfk_b20bdd78-fe72-4ce9-b909-440d2e47e153/ssh-known-hosts-edpm-deployment/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.030370 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9dcc6b98f-chkvz_476caa3a-28ba-471d-b4c0-c263c5960a87/proxy-server/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.057140 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9dcc6b98f-chkvz_476caa3a-28ba-471d-b4c0-c263c5960a87/proxy-httpd/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.155821 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-n24dx_14b301a3-7288-471a-8ca4-cb7f4dca4b96/swift-ring-rebalance/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.223783 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/account-reaper/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.282540 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/account-auditor/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.283195 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/account-replicator/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.395096 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/account-server/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.425788 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/container-auditor/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.486965 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/container-replicator/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.507888 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/container-updater/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.516559 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/container-server/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.601278 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.601334 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.676979 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-replicator/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.735585 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-auditor/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.744962 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-expirer/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.770653 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-updater/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.789554 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/object-server/0.log"
Dec 01 10:39:21 crc kubenswrapper[4867]: I1201 10:39:21.891129 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/rsync/0.log"
Dec 01 10:39:22 crc kubenswrapper[4867]: I1201 10:39:22.036352 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e0a92c42-8470-4f62-a5ed-37ecd7c2c0b3/swift-recon-cron/0.log"
Dec 01 10:39:22 crc kubenswrapper[4867]: I1201 10:39:22.289496 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-c8df6_8a874825-a4d4-446d-b1fe-3317e3b67d55/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:22 crc kubenswrapper[4867]: I1201 10:39:22.335923 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_31b3d747-c383-483d-8919-be1dd3a266b6/tempest-tests-tempest-tests-runner/0.log"
Dec 01 10:39:22 crc kubenswrapper[4867]: I1201 10:39:22.487155 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_640e57ba-3d94-41be-bcd7-0c5eeff8a092/test-operator-logs-container/0.log"
Dec 01 10:39:22 crc kubenswrapper[4867]: I1201 10:39:22.604017 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z9rf8_21f6bea0-2abe-4029-8272-f6da0825cf69/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 01 10:39:50 crc kubenswrapper[4867]: I1201 10:39:50.044562 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/util/0.log"
Dec 01 10:39:50 crc kubenswrapper[4867]: I1201 10:39:50.301713 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/pull/0.log"
Dec 01 10:39:50 crc kubenswrapper[4867]: I1201 10:39:50.323912 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/util/0.log"
Dec 01 10:39:50 crc kubenswrapper[4867]: I1201 10:39:50.358232 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/pull/0.log"
Dec 01 10:39:50 crc kubenswrapper[4867]: I1201 10:39:50.575137 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/util/0.log"
Dec 01 10:39:50 crc kubenswrapper[4867]: I1201 10:39:50.607177 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/extract/0.log"
Dec 01 10:39:50 crc kubenswrapper[4867]: I1201 10:39:50.649848 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d2374qqtm_438274bb-f607-4eef-af53-59566f7176d1/pull/0.log"
Dec 01 10:39:50 crc kubenswrapper[4867]: I1201 10:39:50.857135 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-nrm56_0e850850-d946-42aa-a035-1bf8dcba402f/kube-rbac-proxy/0.log"
Dec 01 10:39:50 crc kubenswrapper[4867]: I1201 10:39:50.949636 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-nrm56_0e850850-d946-42aa-a035-1bf8dcba402f/manager/0.log"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.021552 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-4wbsd_c10410e7-47b2-4a48-bf7d-440a00afd4b4/kube-rbac-proxy/0.log"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.138877 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-4wbsd_c10410e7-47b2-4a48-bf7d-440a00afd4b4/manager/0.log"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.263167 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-9nc4v_0deeeac8-147f-441c-ba67-2e6e9bc32073/manager/0.log"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.300732 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-9nc4v_0deeeac8-147f-441c-ba67-2e6e9bc32073/kube-rbac-proxy/0.log"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.421799 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-vktv2_68e139fd-19f5-4033-93b8-4ebf8397b510/kube-rbac-proxy/0.log"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.601003 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.601087 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.601148 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.601973 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7eade69509a6bd7bbb2082e52b0943889c75068ffe4ca36fc1f8c12b2a2b8834"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.602036 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://7eade69509a6bd7bbb2082e52b0943889c75068ffe4ca36fc1f8c12b2a2b8834" gracePeriod=600
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.632776 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-zdllr_0d369519-2f02-4efe-9deb-885362964597/kube-rbac-proxy/0.log"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.684700 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-vktv2_68e139fd-19f5-4033-93b8-4ebf8397b510/manager/0.log"
Dec 01 10:39:51 crc kubenswrapper[4867]: I1201 10:39:51.741338 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-zdllr_0d369519-2f02-4efe-9deb-885362964597/manager/0.log"
Dec 01 10:39:51 crc kubenswrapper[4867]: E1201 10:39:51.768029 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd237749_4cea_4ff6_a374_8da70f9c879a.slice/crio-7eade69509a6bd7bbb2082e52b0943889c75068ffe4ca36fc1f8c12b2a2b8834.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.171273 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-p7rms_468cf199-ea48-4a5a-ac34-057670369f66/kube-rbac-proxy/0.log"
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.258876 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-p7rms_468cf199-ea48-4a5a-ac34-057670369f66/manager/0.log"
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.460125 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-24whr_b5f9e64b-a7d0-4437-91ac-f84c2441cd8d/kube-rbac-proxy/0.log"
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.604775 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-b4j75_fd8d1846-f143-4ca0-88df-af3eca96175d/kube-rbac-proxy/0.log"
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.624421 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="7eade69509a6bd7bbb2082e52b0943889c75068ffe4ca36fc1f8c12b2a2b8834" exitCode=0
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.624474 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"7eade69509a6bd7bbb2082e52b0943889c75068ffe4ca36fc1f8c12b2a2b8834"}
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.624506 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerStarted","Data":"0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af"}
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.624524 4867 scope.go:117] "RemoveContainer" containerID="47311e5e1c6739ba73f6967986c08edeb174155a12ca11ee181a972aa5fa2ccb"
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.752387 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-24whr_b5f9e64b-a7d0-4437-91ac-f84c2441cd8d/manager/0.log"
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.778847 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-b4j75_fd8d1846-f143-4ca0-88df-af3eca96175d/manager/0.log"
Dec 01 10:39:52 crc kubenswrapper[4867]: I1201 10:39:52.898555 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-492tf_07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a/kube-rbac-proxy/0.log"
Dec 01 10:39:53 crc kubenswrapper[4867]: I1201 10:39:53.044442 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-492tf_07b4a3c9-b63d-4a6f-9227-e2cd767f9d9a/manager/0.log"
Dec 01 10:39:53 crc kubenswrapper[4867]: I1201 10:39:53.184243 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-hlksd_a8956c5b-7421-4442-8d62-773a5fe02fd0/manager/0.log"
Dec 01 10:39:53 crc kubenswrapper[4867]: I1201 10:39:53.244142 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-hlksd_a8956c5b-7421-4442-8d62-773a5fe02fd0/kube-rbac-proxy/0.log"
Dec 01 10:39:53 crc kubenswrapper[4867]: I1201 10:39:53.393880 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-77sbx_cb8d2624-ad08-41e7-bb2a-48bc75a2dd62/kube-rbac-proxy/0.log"
Dec 01 10:39:53 crc kubenswrapper[4867]: I1201 10:39:53.549464 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-twg2p_30c79a23-86f2-4a05-adde-41ada03e2e7e/kube-rbac-proxy/0.log"
Dec 01 10:39:53 crc kubenswrapper[4867]: I1201 10:39:53.586770 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-77sbx_cb8d2624-ad08-41e7-bb2a-48bc75a2dd62/manager/0.log"
Dec 01 10:39:53 crc kubenswrapper[4867]: I1201 10:39:53.703392 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-twg2p_30c79a23-86f2-4a05-adde-41ada03e2e7e/manager/0.log"
Dec 01 10:39:53 crc kubenswrapper[4867]: I1201 10:39:53.808455 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-sjmfh_656a9362-30cf-43f6-9909-95859bef129e/kube-rbac-proxy/0.log"
Dec 01 10:39:53 crc kubenswrapper[4867]: I1201 10:39:53.916196
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-sjmfh_656a9362-30cf-43f6-9909-95859bef129e/manager/0.log" Dec 01 10:39:54 crc kubenswrapper[4867]: I1201 10:39:54.113155 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mxkvs_e9fd074d-b9bc-4215-bfd7-56df604f101c/kube-rbac-proxy/0.log" Dec 01 10:39:54 crc kubenswrapper[4867]: I1201 10:39:54.126218 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mxkvs_e9fd074d-b9bc-4215-bfd7-56df604f101c/manager/0.log" Dec 01 10:39:54 crc kubenswrapper[4867]: I1201 10:39:54.303687 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4_4d73996e-90d0-44f5-85f9-3800f54fc3d7/kube-rbac-proxy/0.log" Dec 01 10:39:54 crc kubenswrapper[4867]: I1201 10:39:54.355349 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd45p5j4_4d73996e-90d0-44f5-85f9-3800f54fc3d7/manager/0.log" Dec 01 10:39:54 crc kubenswrapper[4867]: I1201 10:39:54.977524 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zvsh9_cbb9c171-f076-44a2-9a0a-fafd9aa101ca/registry-server/0.log" Dec 01 10:39:54 crc kubenswrapper[4867]: I1201 10:39:54.988349 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-66fc949795-bpdpc_8072c3c3-367c-47af-952b-f303a97d1afe/operator/0.log" Dec 01 10:39:55 crc kubenswrapper[4867]: I1201 10:39:55.382928 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-grrzp_461764b0-73a3-4866-aec1-e687293591e3/kube-rbac-proxy/0.log" Dec 01 10:39:55 crc 
kubenswrapper[4867]: I1201 10:39:55.386367 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-grrzp_461764b0-73a3-4866-aec1-e687293591e3/manager/0.log" Dec 01 10:39:55 crc kubenswrapper[4867]: I1201 10:39:55.619181 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bhgk8_ffbd9e52-147b-42cd-abaa-ec7d1341b826/manager/0.log" Dec 01 10:39:55 crc kubenswrapper[4867]: I1201 10:39:55.703040 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-bhgk8_ffbd9e52-147b-42cd-abaa-ec7d1341b826/kube-rbac-proxy/0.log" Dec 01 10:39:55 crc kubenswrapper[4867]: I1201 10:39:55.728930 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-68x8r_bb2d9cc0-c5d4-4abe-874e-8ce801c6cbdf/operator/0.log" Dec 01 10:39:55 crc kubenswrapper[4867]: I1201 10:39:55.824698 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56cfc94774-wn77q_860dbd82-4e88-4090-8ce6-658e3201ef67/manager/0.log" Dec 01 10:39:56 crc kubenswrapper[4867]: I1201 10:39:56.004559 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-j698r_f3176675-0a3a-4fd2-9727-349ec1b88de7/kube-rbac-proxy/0.log" Dec 01 10:39:56 crc kubenswrapper[4867]: I1201 10:39:56.065459 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-j698r_f3176675-0a3a-4fd2-9727-349ec1b88de7/manager/0.log" Dec 01 10:39:56 crc kubenswrapper[4867]: I1201 10:39:56.093229 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-l7jwc_9be92a6c-afef-449e-927b-8d0732a2140a/kube-rbac-proxy/0.log" Dec 01 10:39:56 crc kubenswrapper[4867]: I1201 10:39:56.229257 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-l7jwc_9be92a6c-afef-449e-927b-8d0732a2140a/manager/0.log" Dec 01 10:39:56 crc kubenswrapper[4867]: I1201 10:39:56.531694 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-g2ddn_573accdf-9cb1-4643-af86-744e695a1f9d/kube-rbac-proxy/0.log" Dec 01 10:39:56 crc kubenswrapper[4867]: I1201 10:39:56.583571 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-g2ddn_573accdf-9cb1-4643-af86-744e695a1f9d/manager/0.log" Dec 01 10:39:56 crc kubenswrapper[4867]: I1201 10:39:56.753477 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-swrc5_c900776b-c7ea-4e4d-9b6b-00245cf048ce/kube-rbac-proxy/0.log" Dec 01 10:39:56 crc kubenswrapper[4867]: I1201 10:39:56.775351 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-swrc5_c900776b-c7ea-4e4d-9b6b-00245cf048ce/manager/0.log" Dec 01 10:40:17 crc kubenswrapper[4867]: I1201 10:40:17.817307 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mph7x_678a6c46-6e4c-4ec0-aa74-7c89c6dc00b5/control-plane-machine-set-operator/0.log" Dec 01 10:40:17 crc kubenswrapper[4867]: I1201 10:40:17.984835 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-545ws_e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a/kube-rbac-proxy/0.log" Dec 01 10:40:18 crc kubenswrapper[4867]: I1201 
10:40:18.030506 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-545ws_e0f5215e-ef8c-4b8f-8b3f-ecfb5deac62a/machine-api-operator/0.log" Dec 01 10:40:30 crc kubenswrapper[4867]: I1201 10:40:30.231392 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-78xzp_210c03bc-36ef-4bc0-ba17-db783a56d470/cert-manager-controller/0.log" Dec 01 10:40:30 crc kubenswrapper[4867]: I1201 10:40:30.400213 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-njkbq_6b09ecf0-40b3-4271-97da-a662a4b427d6/cert-manager-cainjector/0.log" Dec 01 10:40:30 crc kubenswrapper[4867]: I1201 10:40:30.450327 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nwld2_de894f99-4158-4096-b100-4758130c6c12/cert-manager-webhook/0.log" Dec 01 10:40:44 crc kubenswrapper[4867]: I1201 10:40:44.684416 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-fzb6f_f6e4c850-11f6-495b-a90f-5936dda915e7/nmstate-console-plugin/0.log" Dec 01 10:40:44 crc kubenswrapper[4867]: I1201 10:40:44.886201 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-cbknx_7a3fd2df-a271-4ff0-8488-7f442aedf04e/nmstate-handler/0.log" Dec 01 10:40:44 crc kubenswrapper[4867]: I1201 10:40:44.950518 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-dnfqt_4749ce2f-6e1e-47ef-a5f1-bdd63f409214/kube-rbac-proxy/0.log" Dec 01 10:40:44 crc kubenswrapper[4867]: I1201 10:40:44.992142 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-dnfqt_4749ce2f-6e1e-47ef-a5f1-bdd63f409214/nmstate-metrics/0.log" Dec 01 10:40:45 crc kubenswrapper[4867]: I1201 10:40:45.175421 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-wthkz_92290d91-f34b-4ef8-a2a7-15ed05a8c2a5/nmstate-operator/0.log" Dec 01 10:40:45 crc kubenswrapper[4867]: I1201 10:40:45.298910 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-s47c8_e2549111-bcf2-4c87-abdd-0d4cd9353be9/nmstate-webhook/0.log" Dec 01 10:41:01 crc kubenswrapper[4867]: I1201 10:41:01.190282 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hfnbg_cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a/kube-rbac-proxy/0.log" Dec 01 10:41:01 crc kubenswrapper[4867]: I1201 10:41:01.334413 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hfnbg_cb7589d4-4f58-4cb7-b79f-bf9ccc224a1a/controller/0.log" Dec 01 10:41:01 crc kubenswrapper[4867]: I1201 10:41:01.464414 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-frr-files/0.log" Dec 01 10:41:01 crc kubenswrapper[4867]: I1201 10:41:01.684478 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-reloader/0.log" Dec 01 10:41:01 crc kubenswrapper[4867]: I1201 10:41:01.687940 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-frr-files/0.log" Dec 01 10:41:01 crc kubenswrapper[4867]: I1201 10:41:01.724001 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-reloader/0.log" Dec 01 10:41:01 crc kubenswrapper[4867]: I1201 10:41:01.763828 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-metrics/0.log" Dec 01 10:41:01 crc kubenswrapper[4867]: I1201 10:41:01.985385 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-metrics/0.log" Dec 01 10:41:02 crc kubenswrapper[4867]: I1201 10:41:02.003409 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-frr-files/0.log" Dec 01 10:41:02 crc kubenswrapper[4867]: I1201 10:41:02.010131 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-reloader/0.log" Dec 01 10:41:02 crc kubenswrapper[4867]: I1201 10:41:02.094702 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-metrics/0.log" Dec 01 10:41:02 crc kubenswrapper[4867]: I1201 10:41:02.301699 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-metrics/0.log" Dec 01 10:41:02 crc kubenswrapper[4867]: I1201 10:41:02.325648 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-frr-files/0.log" Dec 01 10:41:02 crc kubenswrapper[4867]: I1201 10:41:02.341906 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/cp-reloader/0.log" Dec 01 10:41:02 crc kubenswrapper[4867]: I1201 10:41:02.350604 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/controller/0.log" Dec 01 10:41:02 crc kubenswrapper[4867]: I1201 10:41:02.584955 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/kube-rbac-proxy-frr/0.log" Dec 01 10:41:02 crc kubenswrapper[4867]: I1201 10:41:02.729643 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/frr-metrics/0.log" Dec 01 10:41:03 crc kubenswrapper[4867]: I1201 10:41:03.019527 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/kube-rbac-proxy/0.log" Dec 01 10:41:03 crc kubenswrapper[4867]: I1201 10:41:03.284512 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-4bvvs_d2bcb3a5-5fb9-4c77-9f79-6d88033b8669/frr-k8s-webhook-server/0.log" Dec 01 10:41:03 crc kubenswrapper[4867]: I1201 10:41:03.290677 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/reloader/0.log" Dec 01 10:41:03 crc kubenswrapper[4867]: I1201 10:41:03.671094 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d79d8d46b-pxjk5_82e433dd-78d1-4cb0-a670-e19c67e09515/manager/0.log" Dec 01 10:41:03 crc kubenswrapper[4867]: I1201 10:41:03.894947 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6f489b594c-qqhv4_cb69f179-7caf-472c-9b20-f327c116f4a2/webhook-server/0.log" Dec 01 10:41:04 crc kubenswrapper[4867]: I1201 10:41:04.176339 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stgvd_b1d4168c-add2-4db2-a491-761b0127d5b1/frr/0.log" Dec 01 10:41:04 crc kubenswrapper[4867]: I1201 10:41:04.297329 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bczj5_c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3/kube-rbac-proxy/0.log" Dec 01 10:41:04 crc kubenswrapper[4867]: I1201 10:41:04.560928 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bczj5_c7f81cd9-bd19-4ed2-95d1-f8bb6fc5d6b3/speaker/0.log" Dec 01 10:41:17 crc kubenswrapper[4867]: I1201 10:41:17.315301 4867 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/util/0.log" Dec 01 10:41:17 crc kubenswrapper[4867]: I1201 10:41:17.535512 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/pull/0.log" Dec 01 10:41:17 crc kubenswrapper[4867]: I1201 10:41:17.561339 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/pull/0.log" Dec 01 10:41:17 crc kubenswrapper[4867]: I1201 10:41:17.562705 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/util/0.log" Dec 01 10:41:17 crc kubenswrapper[4867]: I1201 10:41:17.748083 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/util/0.log" Dec 01 10:41:17 crc kubenswrapper[4867]: I1201 10:41:17.758515 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/pull/0.log" Dec 01 10:41:17 crc kubenswrapper[4867]: I1201 10:41:17.799575 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fqtwhb_eada6e99-1300-49a6-8732-f4b2024526dc/extract/0.log" Dec 01 10:41:17 crc kubenswrapper[4867]: I1201 10:41:17.923348 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/util/0.log" Dec 01 10:41:18 crc kubenswrapper[4867]: I1201 10:41:18.374538 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/pull/0.log" Dec 01 10:41:18 crc kubenswrapper[4867]: I1201 10:41:18.386301 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/pull/0.log" Dec 01 10:41:18 crc kubenswrapper[4867]: I1201 10:41:18.390938 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/util/0.log" Dec 01 10:41:18 crc kubenswrapper[4867]: I1201 10:41:18.681600 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/extract/0.log" Dec 01 10:41:18 crc kubenswrapper[4867]: I1201 10:41:18.712123 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/util/0.log" Dec 01 10:41:18 crc kubenswrapper[4867]: I1201 10:41:18.756299 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vdcvb_22246b0d-5ca8-4aa8-9cb5-0942b473e733/pull/0.log" Dec 01 10:41:18 crc kubenswrapper[4867]: I1201 10:41:18.926500 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-utilities/0.log" Dec 01 10:41:19 crc 
kubenswrapper[4867]: I1201 10:41:19.185457 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-content/0.log" Dec 01 10:41:19 crc kubenswrapper[4867]: I1201 10:41:19.195910 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-utilities/0.log" Dec 01 10:41:19 crc kubenswrapper[4867]: I1201 10:41:19.263022 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-content/0.log" Dec 01 10:41:19 crc kubenswrapper[4867]: I1201 10:41:19.474717 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-content/0.log" Dec 01 10:41:19 crc kubenswrapper[4867]: I1201 10:41:19.498822 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/extract-utilities/0.log" Dec 01 10:41:19 crc kubenswrapper[4867]: I1201 10:41:19.841168 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-utilities/0.log" Dec 01 10:41:20 crc kubenswrapper[4867]: I1201 10:41:20.006262 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-utilities/0.log" Dec 01 10:41:20 crc kubenswrapper[4867]: I1201 10:41:20.027618 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-content/0.log" Dec 01 10:41:20 crc kubenswrapper[4867]: I1201 10:41:20.280652 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-content/0.log" Dec 01 10:41:20 crc kubenswrapper[4867]: I1201 10:41:20.342929 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r6wmp_62ccb15d-a0d6-4799-87e4-99cf2489fa16/registry-server/0.log" Dec 01 10:41:20 crc kubenswrapper[4867]: I1201 10:41:20.565744 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-utilities/0.log" Dec 01 10:41:20 crc kubenswrapper[4867]: I1201 10:41:20.638308 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/extract-content/0.log" Dec 01 10:41:20 crc kubenswrapper[4867]: I1201 10:41:20.865771 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7gpdj_a222161f-afcc-47dc-bc2f-50b228543866/marketplace-operator/0.log" Dec 01 10:41:21 crc kubenswrapper[4867]: I1201 10:41:21.173560 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-utilities/0.log" Dec 01 10:41:21 crc kubenswrapper[4867]: I1201 10:41:21.229830 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8x9hh_6c6c316b-bbb7-4e56-bced-aed519dec778/registry-server/0.log" Dec 01 10:41:21 crc kubenswrapper[4867]: I1201 10:41:21.584008 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-utilities/0.log" Dec 01 10:41:21 crc kubenswrapper[4867]: I1201 10:41:21.623681 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-content/0.log" Dec 01 10:41:21 crc kubenswrapper[4867]: I1201 10:41:21.650214 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-content/0.log" Dec 01 10:41:21 crc kubenswrapper[4867]: I1201 10:41:21.851895 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-utilities/0.log" Dec 01 10:41:21 crc kubenswrapper[4867]: I1201 10:41:21.861579 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/extract-content/0.log" Dec 01 10:41:22 crc kubenswrapper[4867]: I1201 10:41:22.078233 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qqrrr_899e126d-0b32-4d48-b5c4-acc83cea5de4/registry-server/0.log" Dec 01 10:41:22 crc kubenswrapper[4867]: I1201 10:41:22.194003 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t8fj7_7275a285-256c-48dd-b0d6-b80fc37603b6/extract-utilities/0.log" Dec 01 10:41:22 crc kubenswrapper[4867]: I1201 10:41:22.353291 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t8fj7_7275a285-256c-48dd-b0d6-b80fc37603b6/extract-utilities/0.log" Dec 01 10:41:22 crc kubenswrapper[4867]: I1201 10:41:22.399421 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t8fj7_7275a285-256c-48dd-b0d6-b80fc37603b6/extract-content/0.log" Dec 01 10:41:22 crc kubenswrapper[4867]: I1201 10:41:22.426931 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t8fj7_7275a285-256c-48dd-b0d6-b80fc37603b6/extract-content/0.log" 
Dec 01 10:41:22 crc kubenswrapper[4867]: I1201 10:41:22.664027 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t8fj7_7275a285-256c-48dd-b0d6-b80fc37603b6/extract-utilities/0.log" Dec 01 10:41:22 crc kubenswrapper[4867]: I1201 10:41:22.728133 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t8fj7_7275a285-256c-48dd-b0d6-b80fc37603b6/extract-content/0.log" Dec 01 10:41:22 crc kubenswrapper[4867]: I1201 10:41:22.771966 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-t8fj7_7275a285-256c-48dd-b0d6-b80fc37603b6/registry-server/0.log" Dec 01 10:41:51 crc kubenswrapper[4867]: I1201 10:41:51.601194 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:41:51 crc kubenswrapper[4867]: I1201 10:41:51.601739 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:42:21 crc kubenswrapper[4867]: I1201 10:42:21.601427 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:42:21 crc kubenswrapper[4867]: I1201 10:42:21.602059 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" 
podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:42:51 crc kubenswrapper[4867]: I1201 10:42:51.601779 4867 patch_prober.go:28] interesting pod/machine-config-daemon-mt9t2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:42:51 crc kubenswrapper[4867]: I1201 10:42:51.602472 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:42:51 crc kubenswrapper[4867]: I1201 10:42:51.602570 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" Dec 01 10:42:51 crc kubenswrapper[4867]: I1201 10:42:51.603417 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af"} pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:42:51 crc kubenswrapper[4867]: I1201 10:42:51.603481 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerName="machine-config-daemon" containerID="cri-o://0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" gracePeriod=600 Dec 01 
10:42:51 crc kubenswrapper[4867]: E1201 10:42:51.750988 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:42:52 crc kubenswrapper[4867]: I1201 10:42:52.372433 4867 generic.go:334] "Generic (PLEG): container finished" podID="cd237749-4cea-4ff6-a374-8da70f9c879a" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" exitCode=0 Dec 01 10:42:52 crc kubenswrapper[4867]: I1201 10:42:52.372638 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" event={"ID":"cd237749-4cea-4ff6-a374-8da70f9c879a","Type":"ContainerDied","Data":"0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af"} Dec 01 10:42:52 crc kubenswrapper[4867]: I1201 10:42:52.372790 4867 scope.go:117] "RemoveContainer" containerID="7eade69509a6bd7bbb2082e52b0943889c75068ffe4ca36fc1f8c12b2a2b8834" Dec 01 10:42:52 crc kubenswrapper[4867]: I1201 10:42:52.373649 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:42:52 crc kubenswrapper[4867]: E1201 10:42:52.374049 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:43:05 crc kubenswrapper[4867]: I1201 10:43:05.827526 4867 
scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:43:05 crc kubenswrapper[4867]: E1201 10:43:05.828353 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:43:20 crc kubenswrapper[4867]: I1201 10:43:20.827977 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:43:20 crc kubenswrapper[4867]: E1201 10:43:20.828877 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:43:34 crc kubenswrapper[4867]: I1201 10:43:34.826604 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:43:34 crc kubenswrapper[4867]: E1201 10:43:34.827287 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:43:36 crc kubenswrapper[4867]: I1201 
10:43:36.822950 4867 generic.go:334] "Generic (PLEG): container finished" podID="ddabec67-3daf-413f-9c08-fc02e60e9b67" containerID="edd849514fcd4d0f7c7eaa3a9de642c2773ef1e6e900405a358f089efbfb1fd4" exitCode=0 Dec 01 10:43:36 crc kubenswrapper[4867]: I1201 10:43:36.823079 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" event={"ID":"ddabec67-3daf-413f-9c08-fc02e60e9b67","Type":"ContainerDied","Data":"edd849514fcd4d0f7c7eaa3a9de642c2773ef1e6e900405a358f089efbfb1fd4"} Dec 01 10:43:36 crc kubenswrapper[4867]: I1201 10:43:36.823802 4867 scope.go:117] "RemoveContainer" containerID="edd849514fcd4d0f7c7eaa3a9de642c2773ef1e6e900405a358f089efbfb1fd4" Dec 01 10:43:37 crc kubenswrapper[4867]: I1201 10:43:37.139011 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9h2dq_must-gather-tl5wz_ddabec67-3daf-413f-9c08-fc02e60e9b67/gather/0.log" Dec 01 10:43:46 crc kubenswrapper[4867]: I1201 10:43:46.827184 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:43:46 crc kubenswrapper[4867]: E1201 10:43:46.827902 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:43:50 crc kubenswrapper[4867]: I1201 10:43:50.682390 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9h2dq/must-gather-tl5wz"] Dec 01 10:43:50 crc kubenswrapper[4867]: I1201 10:43:50.683392 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" 
podUID="ddabec67-3daf-413f-9c08-fc02e60e9b67" containerName="copy" containerID="cri-o://1616903806d2ea3ad7106f47d4102a5445dcf02f2485c1a13cad91c1d7b65740" gracePeriod=2 Dec 01 10:43:50 crc kubenswrapper[4867]: I1201 10:43:50.693537 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9h2dq/must-gather-tl5wz"] Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.015504 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9h2dq_must-gather-tl5wz_ddabec67-3daf-413f-9c08-fc02e60e9b67/copy/0.log" Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.016647 4867 generic.go:334] "Generic (PLEG): container finished" podID="ddabec67-3daf-413f-9c08-fc02e60e9b67" containerID="1616903806d2ea3ad7106f47d4102a5445dcf02f2485c1a13cad91c1d7b65740" exitCode=143 Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.197474 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9h2dq_must-gather-tl5wz_ddabec67-3daf-413f-9c08-fc02e60e9b67/copy/0.log" Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.197949 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.370542 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddabec67-3daf-413f-9c08-fc02e60e9b67-must-gather-output\") pod \"ddabec67-3daf-413f-9c08-fc02e60e9b67\" (UID: \"ddabec67-3daf-413f-9c08-fc02e60e9b67\") " Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.370624 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gbkt\" (UniqueName: \"kubernetes.io/projected/ddabec67-3daf-413f-9c08-fc02e60e9b67-kube-api-access-5gbkt\") pod \"ddabec67-3daf-413f-9c08-fc02e60e9b67\" (UID: \"ddabec67-3daf-413f-9c08-fc02e60e9b67\") " Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.385204 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddabec67-3daf-413f-9c08-fc02e60e9b67-kube-api-access-5gbkt" (OuterVolumeSpecName: "kube-api-access-5gbkt") pod "ddabec67-3daf-413f-9c08-fc02e60e9b67" (UID: "ddabec67-3daf-413f-9c08-fc02e60e9b67"). InnerVolumeSpecName "kube-api-access-5gbkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.472675 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gbkt\" (UniqueName: \"kubernetes.io/projected/ddabec67-3daf-413f-9c08-fc02e60e9b67-kube-api-access-5gbkt\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.566892 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddabec67-3daf-413f-9c08-fc02e60e9b67-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ddabec67-3daf-413f-9c08-fc02e60e9b67" (UID: "ddabec67-3daf-413f-9c08-fc02e60e9b67"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:43:51 crc kubenswrapper[4867]: I1201 10:43:51.574870 4867 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ddabec67-3daf-413f-9c08-fc02e60e9b67-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:52 crc kubenswrapper[4867]: I1201 10:43:52.036623 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9h2dq_must-gather-tl5wz_ddabec67-3daf-413f-9c08-fc02e60e9b67/copy/0.log" Dec 01 10:43:52 crc kubenswrapper[4867]: I1201 10:43:52.037281 4867 scope.go:117] "RemoveContainer" containerID="1616903806d2ea3ad7106f47d4102a5445dcf02f2485c1a13cad91c1d7b65740" Dec 01 10:43:52 crc kubenswrapper[4867]: I1201 10:43:52.037345 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9h2dq/must-gather-tl5wz" Dec 01 10:43:52 crc kubenswrapper[4867]: I1201 10:43:52.070530 4867 scope.go:117] "RemoveContainer" containerID="edd849514fcd4d0f7c7eaa3a9de642c2773ef1e6e900405a358f089efbfb1fd4" Dec 01 10:43:52 crc kubenswrapper[4867]: I1201 10:43:52.838364 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddabec67-3daf-413f-9c08-fc02e60e9b67" path="/var/lib/kubelet/pods/ddabec67-3daf-413f-9c08-fc02e60e9b67/volumes" Dec 01 10:43:59 crc kubenswrapper[4867]: I1201 10:43:59.826588 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:43:59 crc kubenswrapper[4867]: E1201 10:43:59.827439 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" 
podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:44:11 crc kubenswrapper[4867]: I1201 10:44:11.827542 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:44:11 crc kubenswrapper[4867]: E1201 10:44:11.828742 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:44:23 crc kubenswrapper[4867]: I1201 10:44:23.827478 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:44:23 crc kubenswrapper[4867]: E1201 10:44:23.828367 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:44:34 crc kubenswrapper[4867]: I1201 10:44:34.827234 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:44:34 crc kubenswrapper[4867]: E1201 10:44:34.828074 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:44:47 crc kubenswrapper[4867]: I1201 10:44:47.827919 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:44:47 crc kubenswrapper[4867]: E1201 10:44:47.829076 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:44:58 crc kubenswrapper[4867]: I1201 10:44:58.832205 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:44:58 crc kubenswrapper[4867]: E1201 10:44:58.833196 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.142381 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd"] Dec 01 10:45:00 crc kubenswrapper[4867]: E1201 10:45:00.142863 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1e5dca-fb58-4128-8721-4bb5e86ce405" containerName="container-00" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.142880 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc1e5dca-fb58-4128-8721-4bb5e86ce405" containerName="container-00" Dec 01 10:45:00 crc kubenswrapper[4867]: E1201 10:45:00.142899 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddabec67-3daf-413f-9c08-fc02e60e9b67" containerName="copy" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.142907 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddabec67-3daf-413f-9c08-fc02e60e9b67" containerName="copy" Dec 01 10:45:00 crc kubenswrapper[4867]: E1201 10:45:00.142945 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddabec67-3daf-413f-9c08-fc02e60e9b67" containerName="gather" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.142964 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddabec67-3daf-413f-9c08-fc02e60e9b67" containerName="gather" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.143253 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1e5dca-fb58-4128-8721-4bb5e86ce405" containerName="container-00" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.143266 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddabec67-3daf-413f-9c08-fc02e60e9b67" containerName="copy" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.143283 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddabec67-3daf-413f-9c08-fc02e60e9b67" containerName="gather" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.143889 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.145897 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.153012 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.166145 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd"] Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.243071 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-config-volume\") pod \"collect-profiles-29409765-qwmcd\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.243318 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98xk\" (UniqueName: \"kubernetes.io/projected/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-kube-api-access-x98xk\") pod \"collect-profiles-29409765-qwmcd\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.243357 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-secret-volume\") pod \"collect-profiles-29409765-qwmcd\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.344882 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98xk\" (UniqueName: \"kubernetes.io/projected/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-kube-api-access-x98xk\") pod \"collect-profiles-29409765-qwmcd\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.344941 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-secret-volume\") pod \"collect-profiles-29409765-qwmcd\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.344992 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-config-volume\") pod \"collect-profiles-29409765-qwmcd\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.346316 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-config-volume\") pod \"collect-profiles-29409765-qwmcd\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.358900 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-secret-volume\") pod \"collect-profiles-29409765-qwmcd\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.361294 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98xk\" (UniqueName: \"kubernetes.io/projected/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-kube-api-access-x98xk\") pod \"collect-profiles-29409765-qwmcd\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.471554 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.928780 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd"] Dec 01 10:45:00 crc kubenswrapper[4867]: I1201 10:45:00.953052 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" event={"ID":"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f","Type":"ContainerStarted","Data":"5d5c8e8e10580501b501c94784d168a78ceea84e1fae71e77aff589899772ff9"} Dec 01 10:45:01 crc kubenswrapper[4867]: I1201 10:45:01.966440 4867 generic.go:334] "Generic (PLEG): container finished" podID="b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f" containerID="efdb12e49d0a7d35de86e993ce75e2851bd0af36f1d3c97ad3daab0bfe218c32" exitCode=0 Dec 01 10:45:01 crc kubenswrapper[4867]: I1201 10:45:01.966700 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" 
event={"ID":"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f","Type":"ContainerDied","Data":"efdb12e49d0a7d35de86e993ce75e2851bd0af36f1d3c97ad3daab0bfe218c32"} Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.334672 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.502434 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-secret-volume\") pod \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.502543 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-config-volume\") pod \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.502642 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x98xk\" (UniqueName: \"kubernetes.io/projected/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-kube-api-access-x98xk\") pod \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\" (UID: \"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f\") " Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.503918 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-config-volume" (OuterVolumeSpecName: "config-volume") pod "b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f" (UID: "b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.512011 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f" (UID: "b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.513805 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-kube-api-access-x98xk" (OuterVolumeSpecName: "kube-api-access-x98xk") pod "b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f" (UID: "b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f"). InnerVolumeSpecName "kube-api-access-x98xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.606405 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x98xk\" (UniqueName: \"kubernetes.io/projected/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-kube-api-access-x98xk\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.606479 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.606491 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.991669 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" 
event={"ID":"b07270ef-d95c-4f3c-ac00-9e4e5dde2b9f","Type":"ContainerDied","Data":"5d5c8e8e10580501b501c94784d168a78ceea84e1fae71e77aff589899772ff9"} Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.991710 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-qwmcd" Dec 01 10:45:03 crc kubenswrapper[4867]: I1201 10:45:03.991717 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d5c8e8e10580501b501c94784d168a78ceea84e1fae71e77aff589899772ff9" Dec 01 10:45:04 crc kubenswrapper[4867]: I1201 10:45:04.423797 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b"] Dec 01 10:45:04 crc kubenswrapper[4867]: I1201 10:45:04.434463 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qxf5b"] Dec 01 10:45:04 crc kubenswrapper[4867]: I1201 10:45:04.838115 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13efbea-f333-4f57-8c7a-814104fcd7f5" path="/var/lib/kubelet/pods/b13efbea-f333-4f57-8c7a-814104fcd7f5/volumes" Dec 01 10:45:11 crc kubenswrapper[4867]: I1201 10:45:11.827533 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:45:11 crc kubenswrapper[4867]: E1201 10:45:11.828442 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:45:22 crc kubenswrapper[4867]: I1201 10:45:22.827883 4867 scope.go:117] "RemoveContainer" 
containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:45:22 crc kubenswrapper[4867]: E1201 10:45:22.828727 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:45:23 crc kubenswrapper[4867]: I1201 10:45:23.397411 4867 scope.go:117] "RemoveContainer" containerID="ace37c939462124d9bbd7f93c6ae765c3739ca31aa3e77629716d619e5f31f4d" Dec 01 10:45:35 crc kubenswrapper[4867]: I1201 10:45:35.828704 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:45:35 crc kubenswrapper[4867]: E1201 10:45:35.831370 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:45:46 crc kubenswrapper[4867]: I1201 10:45:46.827165 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:45:46 crc kubenswrapper[4867]: E1201 10:45:46.827944 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:46:00 crc kubenswrapper[4867]: I1201 10:46:00.826825 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:46:00 crc kubenswrapper[4867]: E1201 10:46:00.827591 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:46:13 crc kubenswrapper[4867]: I1201 10:46:13.826896 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:46:13 crc kubenswrapper[4867]: E1201 10:46:13.827654 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:46:24 crc kubenswrapper[4867]: I1201 10:46:24.827140 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:46:24 crc kubenswrapper[4867]: E1201 10:46:24.827981 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:46:36 crc kubenswrapper[4867]: I1201 10:46:36.828579 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:46:36 crc kubenswrapper[4867]: E1201 10:46:36.830251 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:46:49 crc kubenswrapper[4867]: I1201 10:46:49.827316 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:46:49 crc kubenswrapper[4867]: E1201 10:46:49.829095 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:47:01 crc kubenswrapper[4867]: I1201 10:47:01.826913 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:47:01 crc kubenswrapper[4867]: E1201 10:47:01.829034 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:47:12 crc kubenswrapper[4867]: I1201 10:47:12.838455 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:47:12 crc kubenswrapper[4867]: E1201 10:47:12.839404 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:47:23 crc kubenswrapper[4867]: I1201 10:47:23.829273 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:47:23 crc kubenswrapper[4867]: E1201 10:47:23.830083 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:47:36 crc kubenswrapper[4867]: I1201 10:47:36.828142 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:47:36 crc kubenswrapper[4867]: E1201 10:47:36.828866 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a" Dec 01 10:47:48 crc kubenswrapper[4867]: I1201 10:47:48.835239 4867 scope.go:117] "RemoveContainer" containerID="0c6a9fa4b625708428609f15dba823938ea79aeece0eb49820b89cffa34081af" Dec 01 10:47:48 crc kubenswrapper[4867]: E1201 10:47:48.836529 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mt9t2_openshift-machine-config-operator(cd237749-4cea-4ff6-a374-8da70f9c879a)\"" pod="openshift-machine-config-operator/machine-config-daemon-mt9t2" podUID="cd237749-4cea-4ff6-a374-8da70f9c879a"